LIGO-G9900XX-00-M ITR 2003 DMT Sub-Project John G. Zweizig LIGO/Caltech

DMT Overview

DMT – detector characterization platform in C++
» Written for monitoring IFO state, sensitivity, noise, etc.
» Signal-processing and detector-characterization class libraries

Online use (in the LIGO control room)
» Runs continuously on ~10 Solaris SMP nodes (the reference platform)
» Stream of current IFO data received directly from the DAQ
» Real-time, low-latency response
» Operator feedback: alarms, IFO state, sensitivity, noise/transient summaries
» Output: real-time triggers (transient descriptors), statistics trends

Online DMT Data Flow (diagram slide)

Growing Use – Offline Analysis

Offline analysis uses
» Run “monitors” with different configurations on entire raw data samples
 – Data-quality investigations
 – Transient detection
» Use detector-specific code in offline GW search pipelines

Impediments to offline use
» Linux installation is difficult (no binary distribution, no reference platform)
 – Impacts development as well as universal use
» No standard offline environment defined
» Unable to use LSC computational clusters
» Global data access and publication not provided
» Sensitive data – an administrative nightmare
» Local job submission only
» Single data-stream input (multiple inputs would be useful for IFO correlations)

The ITR 2003 DMT sub-project will enable this use.

Offline DMT Data Flow (diagram slide)

ITR 2003 Mission

Grid-enable the DMT
» Improve portability
» Interface to Globus tools and LDR
» Condor-enable
» Develop remote submission capabilities
» Install on LSC clusters

…without adversely affecting the primary DMT mission
» Maintain Solaris compatibility
» Transparent access to the DAQ broadcast stream
» Interface to ROOT graphics
» Operator feedback, database interfaces

Participants

Erik Katsavounidis (MIT PI)
Lee Samuel Finn (PSU PI)
John G. Zweizig (Caltech)
Junwei Cao (MIT, ITR 2003)
Anonymous hire (PSU)
Part-time hire (UWM)
» 25% for deployment, other time as available

Technical

Linux portability
» autoconf/automake build
» RPM generation
» Periodic builds
» Standardize the offline runtime environment
» Decouple from the CERN ROOT package

Data access transparency
» Interface to directory service / LDR

Running on the grid
» Use Condor features (checkpointing, etc.)
» Process management
» Remote submission
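The autoconf/automake port mentioned above amounts to describing the build in two small files. The skeleton below is a hypothetical sketch, not the actual DMT build configuration: the package name, program name, and source file are placeholders.

```
# configure.ac -- hypothetical skeleton (names are placeholders)
AC_INIT([dmt], [1.0])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CXX
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.am -- one monitor binary built from one source file
bin_PROGRAMS = dmtmon
dmtmon_SOURCES = dmtmon.cc
```

Running autoreconf produces a portable `configure` script, which in turn gives the `make dist` and RPM-spec hooks needed for the binary distribution and periodic builds listed above.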

Technical 2

Maintaining clusters
» Installation / maintenance
» Usage prioritization
» Clusters at the observatories & Tier 2 centers

Validation
» Component tests for the nightly build

Nuts ’n bolts
» Multiple data streams
» Rationalize dependencies (fftw-3, …)
» Etc.

Stakeholders

LSC
» Extend DMT availability to LSC offline analysis
 – Improve portability
 – Formalize use of the DMT on LSC Linux clusters
 – Enable remote submission
» Protect data access

Grid community
» Demonstrate gridification of an existing scientific software package
» Using grid middleware in real scientific applications provides feedback to the grid community, e.g. on which tools and features are useful or needed
» Productive readjustment of priorities; more useful middleware

Plans (2004)

Improve portability & versioning of DMT software
» Implement the build procedure using autoconf and automake
» Automate periodic builds & tests of the current version
» Build & maintain binary RPMs (APTs?) for the LSC reference platform
» Define a standard execution environment

Interface the DMT to grid utilities
» Data access / publishing
» User certification?

Verification
» Build a library of component tests for periodic verification

Cluster support (MIT, PSU, UWM)
» Installation

Plans (2005)

Set up the DMT to use Condor features
» Disable the message system (alarms, graphics)
» Set up to run as a “standard”-universe Condor job

Remote submission architecture
» Wrapper/supervisor(?) for Condor jobs
» Web interface for submission

Cluster support (PSU, UWM, MIT)
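Running a monitor as a standard-universe job boils down to a short submit description file. The fragment below is a hypothetical sketch, not an actual DMT submit file: the executable name, configuration flag, and file names are placeholders. Standard-universe checkpointing additionally requires relinking the binary with `condor_compile`.

```
# dmt.sub -- hypothetical Condor submit description (placeholder names)
universe   = standard
executable = dmtmon
arguments  = -conf offline.conf
output     = dmtmon.out
error      = dmtmon.err
log        = dmtmon.log
queue
```

A remote-submission wrapper or web interface would generate a file like this per job and hand it to `condor_submit`; the standard universe then supplies the checkpointing and migration features noted above.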

Plans (2006–7)

Continue development of the remote submission architecture
» Wrapper/supervisor(?) for Condor jobs
» Web interface for submission

Cluster support (MIT, PSU, UWM)