Gregory Mendell, LIGO Hanford Observatory. LIGO-G050374-00-W, LIGO-G050635-00-W, LIGO-G070127-00-W. LIGO S5 Reduced Data Set Generation, March 2007.



What are the RDS frames?

RDS frames are subsets of the raw data, with some channels downsampled.

Level 1 RDS (/archive/frames/S5/L1; type == RDS_R_L1)
 – Separate frames for LHO and LLO. LHO: 131 H1 channels and 143 H2 channels. LLO: 181 L1 channels.
Level 3 RDS (/archive/frames/S5/L3; type == RDS_R_L3)
 – DARM_ERR, STATE_VECTOR, and SEGNUM; separate frames for LHO and LLO.
Level 4 RDS (/archive/frames/S5/L4; type == [H1|H2|L1]_RDS_R_L1)
 – DARM_ERR downsampled by a factor of 4, STATE_VECTOR, and SEGNUM; separate frames for H1, H2, and L1 data.
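The "downsampled" channels above can be illustrated with a toy decimation. This is only a sketch of the rate reduction: real RDS generation applies a proper anti-aliasing filter before resampling, and the 16384 Hz rate here is used purely as an example.

```python
def decimate(samples, factor):
    """Naive decimation by averaging blocks of `factor` samples.

    Illustrative only: production downsampling low-pass filters the
    data first to avoid aliasing.
    """
    return [sum(samples[i:i + factor]) / factor
            for i in range(0, len(samples) - factor + 1, factor)]

# A 16384 Hz channel reduced by a factor of 4 yields 4096 Hz data.
fast = [float(i % 8) for i in range(16384)]
slow = decimate(fast, 4)
print(len(fast), "->", len(slow))  # 16384 -> 4096
```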

LIGO Data Products

Tape capacity: 186 GB/tape; dual copy. (The per-product data rates were lost in transcription; only the total survives.)

Product           Rate (MB/s)  1 yr disk  1 yr tapes  Comp. ratio   Look-back
LHO raw                         272 TB     1800       1.7 on tape   ~1 month
LHO Level 1 RDS                  42 TB      460       1.2 in files  ~most of S5*
LHO Level 3 RDS                 3.5 TB       39       1.2 in files  all S5
LHO Level 4 RDS                0.87 TB       10       1.2 in files  all S5
LHO h(t)                        8.0 TB       88       --            all S5
LHO SFTs                       0.96 TB       11       --            all S5
LLO raw                         133 TB      970       1.5 on tape   ~2 months
LLO Level 1 RDS                  23 TB      249       1.2 in files  ~most of S5*
LLO Level 3 RDS                 1.8 TB       20       1.2 in files  all S5
LLO Level 4 RDS                0.45 TB        5       1.2 in files  all S5
LLO h(t)                        4.0 TB       44       --            all S5
LLO SFTs                       0.48 TB        5       --            all S5
Totals                 16.3     490 TB     3701       as above      all at CIT!

* All of the Level 1 RDS will not necessarily fit on cluster node disks and/or inside the tape library at the sites. A copy of all data of all types is stored at the sites on tape, either in the tape library or off-line in tape cabinets.
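The totals row can be sanity-checked with a quick conversion, assuming binary terabytes (1 TB = 1024² MB):

```python
# Back-of-the-envelope check of the totals row: 16.3 MB/s sustained
# for one year, converted to binary terabytes.
rate_mb_per_s = 16.3
seconds_per_year = 365 * 24 * 3600        # 31,536,000 s
total_tb = rate_mb_per_s * seconds_per_year / 1024**2
print(round(total_tb))  # 490
```

This reproduces the ~490 TB/yr total quoted in the table.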

Channel Lists

Level 1 (excerpt):
L1:LSC-AS_Q 1
L1:LSC-AS_I 2
L1:LSC-POB_Q 2
...
L1:LSC-DARM_CTRL 1
L1:LSC-DARM_ERR 1
...
L0:PEM-EY_BAYMIC 1
L0:PEM-EX_BAYMIC 1
...

Note that one fast channel for 1 yr of S5 data can take up ~2 TB of disk space and cost ~$2000 to archive.

Level 3:
H2:LSC-DARM_ERR 1
H2:IFO-SV_STATE_VECTOR 1
H2:IFO-SV_SEGNUM 1
H1:LSC-DARM_ERR 1
H1:IFO-SV_STATE_VECTOR 1
H1:IFO-SV_SEGNUM 1

Level 4:
H2:LSC-DARM_ERR 4
H2:IFO-SV_STATE_VECTOR 1
H2:IFO-SV_SEGNUM 1
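The channel-list format above (one "&lt;channel&gt; &lt;factor&gt;" pair per line) can be parsed with a few lines of Python. Treating the trailing integer as the downsampling factor is an assumption, based on the Level 4 DARM_ERR entry carrying a 4 where the text says it is downsampled by a factor of 4.

```python
def parse_channel_list(text):
    """Parse an adc*.txt-style channel list.

    Each line holds a channel name and an integer, assumed here to be
    the downsampling factor applied during RDS generation.
    """
    channels = {}
    for line in text.splitlines():
        parts = line.split()
        if len(parts) == 2 and parts[1].isdigit():
            channels[parts[0]] = int(parts[1])
    return channels

listing = """\
H2:LSC-DARM_ERR 4
H2:IFO-SV_STATE_VECTOR 1
H2:IFO-SV_SEGNUM 1"""
factors = parse_channel_list(listing)
print(factors["H2:LSC-DARM_ERR"])  # 4
```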

RDS Monitoring

Follow the link to the directory listing, then browse for the adc*.txt file to get the channel lists.

RDS Changes During S5

1. On Nov. 4, H1:LSC-POY_DC was changed to H1:LSC-POBS_DC.
2. On Nov. 8, H2:LSC-SPOB_MON was changed to H2:LSC-SPOB_I.
3. On Nov. 14, 2005, these channels were added at LLO: L1:IFO-SV_SEGNUM, L1:LSC-AS_Q_1FSR, L1:LSC-AS_Q_0FSR.
4. On Dec. 13, 2005, these channels were added at LLO: L1:LSC-ETMX_CAL_EXC_DAQ 4, L1:LSC-ETMY_CAL_EXC_DAQ 4.
5. On June 15, 2006, new raw data channels were added to the Level 1 RDS: L0:PEM-RADIO_ROOF at LLO; H0:PEM-RADIO_LVEA_H1 at LHO.
6. On Feb. 23, 2006, the sample rate of LSC-DARM_CTRL_EXC_DAQ in the Level 1 RDS frames was increased from 4096 Hz to the full rate.

● Changes are discussed in Run, LDAS, CDS, and DASWG meetings, and with run coordinators, after requests by individuals.
● We should discuss ways to set up standard procedures.

Request for more changes to RDS channels?

Channels not in the RDS frames are available at CIT in the raw frames on tape. These can only be read back in a few parallel streams, so it would take a while to process data from the raw frames.

Do we want to add or remove channels in the Level 1 RDS? (We must consider disk space within the lab and at the LSC sites, access time to the data on tape, and the cost of archiving.)

Do we want to regenerate the Level 1 RDS, and/or generate a Level 2 RDS data set? (Note that new h(t) frames may also carry extra channels for regenerating calibrated data.)

We can also consider generating custom RDS data sets at CIT.

Proposed change to RDS compression

A proposal to change the compression of the RDS frames from gzip to zero_suppress_int_float was made to the LSC and S5 run coordinators at a recent S5 run meeting. I have not heard whether this was discussed further during the last run meeting; the issue is whether to implement the change during S5 or after S5. Both compression methods are already in the C and C++ frame libraries, but the latter is not in the fast frame library used by dtt. Tests I have already run show that zero_suppress_int_float would reduce the size of the Level 1 frames by 30% and speed up output (and probably input) by around 20%.
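The idea behind zero-suppression can be sketched with a toy run-length coder. This is not the frame library's actual zero_suppress_int_float algorithm, only an illustration of why mostly-quiet integer channels shrink well when runs of zeros are stored as counts rather than handed to a generic byte compressor.

```python
def zero_suppress(values):
    """Toy run-length coding of zero runs: emit (zero_run_length, value)
    pairs, with a final (trailing_zero_count, None) marker.

    Illustrative only; NOT the frame library's algorithm.
    """
    out, run = [], 0
    for v in values:
        if v == 0:
            run += 1
        else:
            out.append((run, v))
            run = 0
    out.append((run, None))  # trailing zeros
    return out

# A 100-sample channel that is zero except for two excursions.
samples = [0] * 100
samples[10] = 7
samples[50] = -3
encoded = zero_suppress(samples)
print(encoded)  # [(10, 7), (39, -3), (49, None)]
```

One hundred samples collapse to three pairs; the sparser the channel, the bigger the win.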

END

How are RDS frames generated?

Driver script: createrds.tcl uses the LigoTools LDASjob package to submit jobs.
LDAS: gets segments from the diskcacheAPI and runs the LDAS createRDS jobs.
SAMFS archive: RDS frames are written to the archive filesystem.
LDR: RDS frames are transferred to CIT via LDR, and on to MIT, UWM, PSU, and AEI.
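The driver's role above can be sketched as assembling one createRDS job per data segment. The option names and call format below are illustrative stand-ins, not the exact LDAS user-command syntax, and the GPS times and channel names are hypothetical examples.

```python
def build_createrds_job(start_gps, end_gps, frame_type, channels, out_dir):
    """Assemble a createRDS-style job specification for one segment.

    The field names here are assumptions made for illustration; the
    real createrds.tcl driver builds LDAS user commands via the
    LigoTools LDASjob package.
    """
    return ("createRDS"
            f" -times {start_gps}-{end_gps}"
            f" -type {frame_type}"
            f" -channels {','.join(channels)}"
            f" -outputdir {out_dir}")

# One hypothetical 64 s segment of S5 Level 1 RDS generation.
cmd = build_createrds_job(
    815155213, 815155277, "RDS_R_L1",
    ["H1:LSC-DARM_ERR", "H0:PEM-RADIO_LVEA_H1"],
    "/archive/frames/S5/L1")
print(cmd)
```

In the real pipeline, the driver loops over the segment list returned by the diskcacheAPI and submits one such job per segment.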