A GRID solution for Gravitational Waves Signal Analysis from Coalescing Binaries: preliminary algorithms and tests


F. Acernese 1,2, F. Barone 2,3, R. De Rosa 1,2, R. Esposito 2, P. Mastroserio 2, L. Milano 1,2, S. Pardi 1, K. Qipiani 2, G. Spadaccini 1,2
1 Dip. Scienze Fisiche, Univ. Napoli, Italy; 2 INFN, Sez. Napoli, Italy; 3 Dip. Scienze Farmaceutiche, Univ. Salerno, Italy

Motivation
The extraction of gravitational wave signals of coalescing binaries from the output of an interferometric antenna may require computing power generally not available in a single computing centre or laboratory. One possible way to overcome this problem is to use the computing power available in different places as a single, geographically distributed computing system. This possibility is now practical within the GRID environment, which allows the computing effort required by a specific data analysis procedure to be distributed among different sites according to their available computing power. Within this environment we developed a prototype system, with its application software, for the first experimental tests of a geographically distributed analysis of gravitational wave signals from coalescing binary systems. These tests were performed on a small GRID of three local nodes: the Bologna node, acting as Tier-1, and the Napoli and Roma nodes, both acting as Tier-2.

The tests were aimed: 1) to assess both the reliability and the performance of the physics application software developed for the coalescing binary analysis by the Napoli group; 2) to test how efficiently the special tools developed for GRID integrate with this specific application.
Conclusions
- We have successfully verified that multiple jobs can be submitted, and their output retrieved, with a small overhead time.
- Computational grids appear very well suited to data analysis for coalescing binary searches.

GRID
GRID is an infrastructure that enables the integrated, collaborative use of high-end computers, networks, databases, and scientific instruments owned and managed by multiple organisations. GRID applications often involve large amounts of data and/or computing, often require secure resource sharing across organisational boundaries, and are thus not easily handled by today's Internet and Web infrastructures. Within this context, the Globus project is developing the fundamental technologies needed to build computational grids.

Matched Filter Algorithm
The matched filter is the optimal procedure for searching for a known waveform embedded in background noise. Despite its optimality, it carries a high computational cost: the method is based on an exhaustive comparison between the signal and all the candidate waveforms considered, called templates. Increasing the number of templates improves the quality of the signal identification, but requires a very large number of operations.
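The matched-filter comparison described above can be illustrated with a minimal, self-contained sketch: slide a unit-norm template over the data and take the peak correlation, which in unit-variance white noise is the SNR maximized over arrival time. This is a toy illustration of the technique, not the group's production pipeline; the chirp parameters and signal amplitude below are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def matched_filter_snr(data, template):
    """Slide a unit-norm template over the data and return the peak |correlation|.

    For Gaussian white noise of unit variance this peak is the matched-filter
    signal-to-noise ratio (SNR), maximized over the arrival time of the signal.
    """
    t = template / np.sqrt(np.sum(template ** 2))   # normalize the template
    corr = np.correlate(data, t, mode="valid")      # correlation at every lag
    return float(np.max(np.abs(corr)))

# Toy "chirp" template: a sinusoid whose frequency increases with time
# (stand-in for a coalescing-binary waveform; parameters are arbitrary).
n = 256
time = np.arange(n) / n
template = np.sin(2 * np.pi * (20 + 30 * time) * time)

noise = rng.standard_normal(4096)          # unit-variance white noise
signal = noise.copy()
signal[1000:1000 + n] += 0.8 * template    # bury a scaled chirp in the noise

snr_noise = matched_filter_snr(noise, template)
snr_with = matched_filter_snr(signal, template)
```

The filter reports a much larger peak when the template is actually present, which is the basis for deciding whether a candidate event exists; scanning many templates repeats this correlation once per template, which is where the large computational cost comes from.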
Steps of the procedure
Step 1: The data were extracted from the CNAF-Bologna Mass Storage System. The extraction process reads the VIRGO standard frame format, performs a simple resampling, and publishes the selected data file on the Storage Element.
Step 2: The search was performed by dividing the template space into 200 subspaces and submitting, from the Napoli User Interface, one job per template subspace. Each job reads the selected data file from the Storage Element (located at CNAF-Bologna) and runs on the Worker Nodes selected by the Resource Broker within the VIRGO VO. Finally, the output data of each job were retrieved from the Napoli User Interface.
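The template-space splitting of Step 2 can be sketched as follows: partition the template indices into 200 nearly equal contiguous subspaces and generate one EDG-style JDL job description per subspace. This is an illustrative sketch only; the executable name `cb_search.sh`, the file names, and the total template count of 10,000 are hypothetical (the actual count is not given in the transcript), and the poster does not state the exact JDL used.

```python
def split_templates(n_templates, n_jobs):
    """Divide template indices 0..n_templates-1 into n_jobs contiguous,
    nearly equal subspaces (the first n_templates % n_jobs get one extra)."""
    base, extra = divmod(n_templates, n_jobs)
    chunks, start = [], 0
    for i in range(n_jobs):
        size = base + (1 if i < extra else 0)
        chunks.append(range(start, start + size))
        start += size
    return chunks

def make_jdl(job_id, first, last):
    """Minimal EDG/LCG-style JDL text for one template subspace.
    Executable and sandbox file names are hypothetical."""
    return (
        'Executable = "cb_search.sh";\n'
        f'Arguments = "--first-template {first} --last-template {last}";\n'
        f'StdOutput = "job{job_id}.out";\n'
        f'StdError = "job{job_id}.err";\n'
        f'OutputSandbox = {{"job{job_id}.out", "job{job_id}.err"}};\n'
    )

# One job per subspace, as in Step 2 (10,000 templates is an assumed figure).
chunks = split_templates(10_000, 200)
jdls = [make_jdl(i, c[0], c[-1]) for i, c in enumerate(chunks)]
```

Because the subspaces are disjoint and cover the whole template bank, the 200 jobs can run independently on whichever Worker Nodes the Resource Broker selects, and their outputs are simply concatenated afterwards.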
Conditions of the test procedure
Algorithm: standard matched filters; templates generated at PN order 2 with Taylor approximants.
Data: simulated at 20 kHz; each data frame is 1 second long; total data length: 600 s.
Conditions: raw data resampled at 2 kHz; lower frequency: 60 Hz; upper frequency: 1 kHz; search space: 2 – 10 solar masses; minimal match: 0.97; number of templates: ~

Scheme of the computational GRID used during the tests
CNAF-Bologna: Resource Broker, Information Index, Replica Catalogue, Computing Element, Worker Nodes 1-3, Storage Element.
INFN Roma1: Computing Element, Worker Nodes 1-2, User Interface.
INFN Napoli: Computing Element, Worker Nodes 1-2, User Interface, Data Storage Element.
The three sites are connected through the GARR network.

Map of the templates used in the test (tau space)

Università degli Studi di Napoli "Federico II" / Istituto Nazionale di Fisica Nucleare, Sezione di Napoli / Università degli Studi di Salerno
CHEP03, March 24-28 2003, La Jolla, California
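The "simple resampling" of the raw data from 20 kHz down to the 2 kHz analysis rate (Step 1 and the test conditions) can be sketched as an FFT low-pass filter below the new Nyquist frequency followed by decimation. This is one possible naive implementation, not necessarily the one used in the extraction process.

```python
import numpy as np

def resample(x, fs_in=20_000, fs_out=2_000):
    """Naive resampling from fs_in to fs_out: zero all spectral content
    above the new Nyquist frequency, then keep every (fs_in//fs_out)-th sample."""
    factor = fs_in // fs_out
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs_in)
    X[freqs > fs_out / 2] = 0.0                 # anti-aliasing low-pass
    filtered = np.fft.irfft(X, n=len(x))
    return filtered[::factor]

# 1 s of data at 20 kHz: a 100 Hz component (in the 60 Hz - 1 kHz search band)
# plus a 1.5 kHz component that must be removed before decimating to 2 kHz.
t = np.arange(20_000) / 20_000
x = np.sin(2 * np.pi * 100 * t) + 0.5 * np.sin(2 * np.pi * 1500 * t)

y = resample(x)
ref = np.sin(2 * np.pi * 100 * np.arange(2_000) / 2_000)  # expected survivor
```

Without the low-pass step, the 1.5 kHz component would alias into the analysis band after decimation; zeroing it first leaves the in-band 100 Hz component intact at the new rate.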