Data reduction for S3
11/12/2003, LIGO Document G030554-00-Z
I Leonor (UOregon), P Charlton (CIT), S Anderson (CIT), K Bayer (MIT), M Foster (PSU), S Grunewald (AEI), B Johnson (LHO), S Koranda (UWM), D Kozak (CIT), G Mendell (LHO), H Pulapaka (CIT), I Yakushin (LLO)

Presentation transcript:

11/12/2003, LIGO Document G030554-00-Z, Slide 1
Data reduction for S3
I Leonor (UOregon), P Charlton (CIT), S Anderson (CIT), K Bayer (MIT), M Foster (PSU), S Grunewald (AEI), B Johnson (LHO), S Koranda (UWM), D Kozak (CIT), G Mendell (LHO), H Pulapaka (CIT), I Yakushin (LLO)
LSC Meeting, LIGO Hanford Observatory, Nov 9-13
Detector Characterization Session

11/12/2003, LIGO Document G030554-00-Z, Slide 2
Tools: Data Reduction
- RDS data is generated by submitting createRDS user commands to LDAS via createrdsGUI.tcl, a graphical interface to LDAS createRDS (a non-GUI version is also available).
- LDAS can perform channel cuts, decimation and merging of LHO/LLO frame files; these operations are driven by the RDS scripts (see the decimation sketch after this list).
- The RDS scripts are designed to run 24/7 with minimal supervision and will continue running through LDAS downtime.
- The GUI provides visual feedback on RDS progress and error states, plus web monitoring and notices.
- With current optimisation and hardware upgrades, performance at the sites is sufficient to reduce data faster than real time.
- When necessary, the scripts can run many createRDS threads in parallel for high-speed reduction.
- The S3 RDS scripts use 2 threads for L1 (>4x real time at LHO, >5x at LLO) and 1 thread for L2 and L3 (>10x real time).
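To make the decimation step concrete, here is a minimal sketch of cutting a channel's sample rate by a factor of 4 with an anti-alias filter, written in Python/SciPy. It illustrates the operation only; the actual createRDS implementation is internal to LDAS, and the sample rate and channel data here are made up.

import numpy as np
from scipy.signal import decimate

fs_in = 16384                     # raw channel sample rate (Hz), assumed
factor = 4                        # decimation factor -> 4096 Hz output
t = np.arange(fs_in) / fs_in
x = np.sin(2 * np.pi * 100 * t)   # stand-in for one second of channel data

# decimate() applies a low-pass filter before downsampling, so the
# reduced channel is not aliased; this is the essence of an RDS cut.
y = decimate(x, factor, ftype='fir', zero_phase=True)
print(len(x), '->', len(y))       # 16384 -> 4096 samples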

11/12/2003, LIGO Document G030554-00-Z, Slide 3
[Figure-only slide; no transcript text.]

11/12/2003, LIGO Document G030554-00-Z, Slide 4
Tools: Data Propagation
- The Lightweight Data Replicator (LDR), developed by Scott Koranda, is being used to propagate data from LHO/LLO to CIT and from CIT to the Tier 2 centres (MIT, PSU, UWM, AEI).
- Helper applications (Pulapaka/Koranda/Johnson):
  - A script for moving data from the framebuilder to the LDAS tape archive.
  - LDRLockedSegments.tcl: retrieves information about locked time segments for each IFO from LDAS.
  - RDS-publish.py: verifies RDS files (FrCheck) and publishes meta-data to the LDR database (see the sketch after this list).
  - Local Storage Module: a plugin to LDR which organises data into user-defined locations.
- See Scott Koranda's talk for more details.
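As a rough illustration of the verify-then-publish pattern of RDS-publish.py: run the frame library's FrCheck utility on each file and only record metadata when it passes. This is a minimal sketch; the FrCheck command-line flags, the publish_to_ldr() helper and the example file name are assumptions, not the real LDR interface.

import os
import subprocess

def verify_frame(path):
    # Run FrCheck on the frame file; zero exit status is taken to mean
    # the file structure and checksums are intact (flags assumed).
    result = subprocess.run(['FrCheck', '-i', path], capture_output=True)
    return result.returncode == 0

def publish_to_ldr(path):
    # Hypothetical stand-in for inserting file metadata into the LDR
    # database so that replication to the Tier 2 centres can pick it up.
    size = os.path.getsize(path)
    print(f'would publish {os.path.basename(path)} ({size} bytes) to LDR')

frame = 'H-RDS_R_L1-751900000-16.gwf'   # example RDS frame-file name
if verify_frame(frame):
    publish_to_ldr(frame)
else:
    print(f'FrCheck failed on {frame}; not publishing')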

11/12/2003, LIGO Document G030554-00-Z, Slide 5
Tools: Monitoring
- RDS GUI visual alerts.
- LDAS Search Summary Pages on the web:
  - LDAS CIT web page
  - createrds: HTML version of the GUI visual alerts and logs.
  - datamon: monitors how long it takes before reduced data from the IFO sites is visible in LDAS at CIT and MIT (a toy version follows this list).
- E-mail alerts sent to designated addresses in case of errors, e.g.:
  - LDAS job submission errors and job failures.
  - RDS data generation rate falling behind.
  - LDAS RDS data visibility falling behind.
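A toy version of the datamon check described above: read the GPS start time encoded in a reduced frame file's name and compare it with the current GPS time. The file-name pattern and the 2003 GPS-UTC leap-second offset are assumptions for illustration.

import re
import time

GPS_EPOCH_UNIX = 315964800   # 1980-01-06 00:00:00 UTC as a Unix timestamp
LEAP_SECONDS = 13            # GPS-UTC offset as of 2003 (assumed)

def gps_now():
    # Approximate current GPS time from the system clock.
    return int(time.time()) - GPS_EPOCH_UNIX + LEAP_SECONDS

def file_gps_start(name):
    # Frame files are assumed to be named <site>-<type>-<gps-start>-<duration>.gwf
    m = re.match(r'[A-Z0-9]+-\w+-(\d+)-(\d+)\.gwf$', name)
    return int(m.group(1)) if m else None

name = 'H-RDS_R_L1-751900000-16.gwf'
lag = gps_now() - file_gps_start(name)   # minutes for a freshly written file
print(f'{name} lags real time by ~{lag / 60:.1f} min')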

11/12/2003, LIGO Document G030554-00-Z, Slide 6
Tools: Validation
- Prior to the Mock Data Challenge:
  - Installed pre-release LDAS at LHO, LLO, CIT.
  - Installed updated LDR with the new Globus toolkit and the LDR helper apps.
  - Installed the RDS GUI and web monitoring.
- RDS MDC, Oct:
  - L1, L2, L3 RDS data generated at LHO, LLO.
  - Tapes shipped to CIT.
  - Data reduced at CIT starting 9 Oct.
  - LDR began moving data from LHO on Oct 10, but not in "near real-time".
- E10 was used as a re-run of the MDC with LDAS.
- The MDC did not achieve all of its goals, but many problems were solved. Far fewer problems appeared in E10, and all were resolved before S3.

11/12/2003, LIGO Document G030554-00-Z, Slide 7
S3 Reduced Data Set
- Site-specific files (H and L), gzip compressed.
- Level 0: raw data, copied to the LDAS tape archive at each site as it is written by the framebuilder.
  - LHO: 9739 channels, 6.7 MB/s. LLO: 4435 channels, 2.9 MB/s.
  - TOTAL: 14174 channels, 9.6 MB/s.
- Level 1: first level of reduction.
  - LHO: 264 channels, ~0.9 MB/s. LLO: 139 channels, ~0.44 MB/s.
  - TOTAL: 403 channels, 1.34 MB/s (~1/7 of L0).
- Level 2: second level (requested by the PULG).
  - AS_Q, DARM_CTRL, DARM_CTRL_EXC_DAQ, ETMX_EXC_DAQ, DARM_GAIN, ICMTRX_01, SV_STATE_VECTOR.
  - LHO: 14 channels, 0.17 MB/s. LLO: 7 channels, 0.09 MB/s.
  - TOTAL: 21 channels, 0.26 MB/s (~1/5 of L1).
- Level 3: third level (AS_Q only).
  - LHO: H1 & H2 AS_Q, 0.11 MB/s. LLO: L1 AS_Q, 0.06 MB/s.
  - TOTAL: 3 channels, 0.17 MB/s (~2/3 of L2).
(The quoted ratios are checked in the sketch after this list.)
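The compression ratios quoted in parentheses follow directly from the combined LHO+LLO rates; a short check using only the numbers on this slide:

rates = {'L0': 9.6, 'L1': 1.34, 'L2': 0.26, 'L3': 0.17}   # MB/s, LHO+LLO

for hi, lo in [('L0', 'L1'), ('L1', 'L2'), ('L2', 'L3')]:
    print(f'{lo}/{hi} = {rates[lo] / rates[hi]:.2f}')
# L1/L0 = 0.14 (~1/7), L2/L1 = 0.19 (~1/5), L3/L2 = 0.65 (~2/3)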

11/12/2003LIGO Document G Z8 CIT RDS: L1, L2 LHO RDS: L1, L2, L3 LLO RDS: L1, L2, L3 L3 RDS (LDR) L3 RDS (LDR) L0 RAW (FedEx) MIT LDR: L1, L2, L3 PSU LDR: L1, L2, L3 AEI LDR: L1, L2, L3 UWM LDR: L1, L2, L3 L1, L2, L3 RDS (LDR) L1, L2, L3 RDS (LDR) L1 RDS (FedEx)

11/12/2003, LIGO Document G030554-00-Z, Slide 9
S3 Data Reduction Rates
- Rate of RDS generation as a multiple of real time:
  - LHO: L1 - …x, L2 - 18x, L3 - 23x.
  - LLO: L1 - 6x, L2 - 11x, L3 - 13x.
  - CIT: L1 - …x, L2 - 23x.
- Time between frame-file GPS time and reduction:
  - LHO: L1 - … min, L2 - … min, L3 - … min.
  - LLO: L1 - 5 min, L2 - … min, L3 - … min.
- GridFTP transfer rates:
  - LHO->CIT 4 MB/s, LLO->CIT 0.1 MB/s.
  - CIT->MIT 1 MB/s, CIT->UWM 4 MB/s.
- Delay in the L0->L1->L2->L3->CIT->MIT pipeline: total Level 3 RDS delay to Tier 2 is 2-3 hours (a back-of-envelope check follows).
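A back-of-envelope check that the GridFTP rates above support the quoted Tier 2 delay: minutes needed to move one hour's worth of Level 3 data over each link, using the per-site L3 production rates from slide 7. The pairing of links with source rates is an assumption for illustration.

links = [
    # (link, GridFTP rate MB/s, L3 source rate MB/s)
    ('LHO->CIT', 4.0, 0.11),
    ('LLO->CIT', 0.1, 0.06),
    ('CIT->MIT', 1.0, 0.17),   # CIT fans out the combined L3 stream
    ('CIT->UWM', 4.0, 0.17),
]
for link, bw, src in links:
    minutes = src * 3600 / bw / 60
    print(f'{link}: ~{minutes:.0f} min to ship one hour of L3 data')
# LLO->CIT is the slowest hop (~36 min per hour of data); together with
# the reduction latencies above, this is consistent with a total Level 3
# delay to Tier 2 of 2-3 hours.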