Migration of Monte Carlo Simulation of High Energy Atmospheric Showers to GRID Infrastructure

Presentation transcript:

Migration of Monte Carlo Simulation of High Energy Atmospheric Showers to GRID Infrastructure

Adolfo VAZQUEZ, Jose Luis CONTRERAS (Grupo de Altas Energías, Universidad Complutense de Madrid, Madrid, Spain)
Aitor IBARRA, Ignacio DE LA CALLE, Daniel TAPIADOR (Ingeniería y Servicios Aeroespaciales S.A. (INSA), Madrid, Spain)
Presented by: ADOLFO VAZQUEZ

ABSTRACT
The MAGIC telescope [1], a 17-meter Cherenkov telescope located on La Palma (Canary Islands), is dedicated to the study of the universe in Very High Energy gamma-rays. These particles arrive at the Earth's atmosphere and produce atmospheric showers of secondary particles that can be detected on the ground with the appropriate detectors. MAGIC is one such detector, sensitive to the Cherenkov radiation produced by the charged component of the showers. MAGIC relies on a large number of Monte Carlo simulations [4] for the characterization and identification of the recorded events. The simulations are used to evaluate efficiencies and to identify patterns that distinguish genuine gamma-ray events from unwanted background events. Until now, these simulations have been executed on local queuing systems, resulting in long execution times and a complex organizational task. Due to the parallel nature of these simulations, a Grid-based simulation system is the natural solution [6]. In this work, a system that uses the current resources of the MAGIC Virtual Organization on EGEE is proposed. It can easily be generalized to support the simulation of any similar instrument, such as the planned Cherenkov Telescope Array [2]. The proposed system, based on a Client/Server architecture [3], provides the user with a single access point to the simulation environment through a remote graphical user interface, the Client. The Client can be accessed via a web browser, using web service technology, with no additional software installation required on the user side. The Server processes the user requests and uses a database [8] for both the data catalogue and job management inside the Grid [7]. The design, first production tests and lessons learned from the system are discussed here.

PHYSICS SIMULATION
1. The particle's (gamma-ray or proton) interaction with the atmosphere produces a shower of secondary particles.
2. The charged particles of the shower generate Cherenkov light.
3. The Cherenkov light is reflected by the telescope mirror and focused onto the camera.
4. The reflected light is captured by the camera, producing a pattern determined by the nature of the primary particle.

MAGIC Analysis pipeline: Shower Simulation MMCS [5] + Reflector + Camera
- MMCS: execute the MAGIC Monte Carlo Simulation (MMCS) and store the output data.
- Reflector: simulate the telescope optics with the reflector program, using the MMCS files as input, and store the output data.
- Camera: simulate the camera response for all the files of interest generated by reflector, and save the output data, which will later be analysed with MARS.

TEST CASE
1.5 million showers with a given energy range and direction in the sky (theta, phi) need to be generated as background for the Crab Nebula, the standard candle of high-energy gamma-ray astronomy.

Lessons Learned
- Reactivation of the MAGIC Virtual Organization on EGEE.
- Configuration of the required services for that VO (VOMS, SEs, CEs, ...).
- Testbed environment installation and configuration in collaboration with CIEMAT, PIC and TU Dortmund.
- Porting of the MMCS software (the MAGIC simulation tool) to the Grid environment and automation of its installation on the Grid.
- Development of scripts to test massive simulations on several Grid middlewares (LCG, gLite and EDG).
- Porting of the Monte Carlo simulations to the Grid.
- Installation and configuration of a node based on Globus + GridWay for job management.
- Development of a web-based graphical client (using Java Web Start) for launching Monte Carlo simulations through the DRMAA API provided by GridWay (a submission sketch is given after the Future Work list below).
- Research and initial developments in new work areas:
  - Metadata for the data catalogue and monitoring (a catalogue sketch is also given after the Future Work list).
  - Development of a multi-agent system for job management.
  - Virtual Observatory compliance of the output data.

Future Work
- Grid-enable the remaining MAGIC pipeline (reflector + camera).
- Multi-agent system for job management.
- Easy and fast access to large amounts of data (e.g. no download required).
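A production like the test case above maps naturally onto many independent Grid jobs, one per block of showers, which is exactly the use case for the GridWay DRMAA interface mentioned in the list above. The following is a minimal sketch, assuming the standard DRMAA 1.0 Java binding (org.ggf.drmaa) distributed with GridWay; the wrapper script name (mmcs_wrapper.sh), the input-card naming scheme, the number of tasks and the log directory are illustrative assumptions, not the project's actual configuration.

```java
import java.util.Arrays;
import java.util.List;

import org.ggf.drmaa.DrmaaException;
import org.ggf.drmaa.JobTemplate;
import org.ggf.drmaa.Session;
import org.ggf.drmaa.SessionFactory;

/**
 * Sketch: submit a Monte Carlo production as a DRMAA bulk job through
 * GridWay. Each task simulates one block of showers; the wrapper script
 * and input-card naming below are hypothetical.
 */
public class McProductionSubmitter {

    public static void main(String[] args) throws DrmaaException {
        Session session = SessionFactory.getFactory().getSession();
        session.init("");                          // default contact: the local GridWay instance

        JobTemplate jt = session.createJobTemplate();
        jt.setRemoteCommand("mmcs_wrapper.sh");    // hypothetical wrapper that installs and runs MMCS on the worker node
        // DRMAA substitutes PARAMETRIC_INDEX with the task index of the bulk job,
        // so every task reads its own CORSIKA/MMCS input card.
        jt.setArgs(Arrays.asList("cards/run_" + JobTemplate.PARAMETRIC_INDEX + ".card"));
        jt.setJobName("magic-mmcs");
        jt.setOutputPath(":" + System.getProperty("user.dir") + "/logs");
        jt.setErrorPath(":" + System.getProperty("user.dir") + "/logs");

        // For example, 1000 tasks of 1500 showers each would cover the
        // 1.5 million showers of the Crab Nebula background test case.
        List<?> jobIds = session.runBulkJobs(jt, 1, 1000, 1);
        System.out.println("Submitted " + jobIds.size() + " tasks");

        // Block until every task has finished, then discard their exit information.
        session.synchronize(jobIds, Session.TIMEOUT_WAIT_FOREVER, true);

        session.deleteJobTemplate(jt);
        session.exit();
    }
}
```

In the system described here such a submission would be issued by the Server on behalf of the web client, with the returned job identifiers recorded in the job-management database rather than printed to the console.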
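The abstract also states that the Server uses a database for the data catalogue and for job management, following the RefDB idea [8], and "metadata for the data catalogue and monitoring" appears among the ongoing developments. The sketch below only illustrates what such a catalogue could record per simulation run (shower parameters, Grid job identifier, status and output location); the class, table and column names are hypothetical, an in-memory HSQLDB instance is used just so the example is self-contained, and none of it is the project's actual schema.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

/**
 * Sketch of a minimal simulation catalogue, loosely inspired by RefDB [8].
 * All names are hypothetical; requires the HSQLDB driver on the classpath.
 */
public class CatalogueBootstrap {

    public static void main(String[] args) throws SQLException {
        try (Connection c = DriverManager.getConnection("jdbc:hsqldb:mem:magicmc", "sa", "");
             Statement st = c.createStatement()) {

            // One row per simulated run: physics parameters, the Grid job that
            // produced it, its lifecycle status and where the output was stored.
            st.executeUpdate(
                "CREATE TABLE mc_run ("
              + "  run_id      INTEGER PRIMARY KEY,"
              + "  particle    VARCHAR(16),"       // 'gamma' or 'proton'
              + "  e_min_gev   DOUBLE,"
              + "  e_max_gev   DOUBLE,"
              + "  theta_deg   DOUBLE,"
              + "  phi_deg     DOUBLE,"
              + "  n_showers   INTEGER,"
              + "  grid_job_id VARCHAR(64),"       // identifier returned by DRMAA/GridWay
              + "  status      VARCHAR(16),"       // SUBMITTED / RUNNING / DONE / FAILED
              + "  output_lfn  VARCHAR(256)"       // logical file name on a Grid storage element
              + ")");

            st.executeUpdate(
                "INSERT INTO mc_run VALUES (1, 'proton', 30, 30000, 20, 0, 1500,"
              + " 'job-0001', 'SUBMITTED', 'lfn:/grid/magic/mc/run_1.out')");
        }
    }
}
```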
References
[1]
[2]
[3] A. Ibarra et al., "Remote Interface to Science Analysis Tools for Grid Architecture: The XMM-Newton SAS Case", Astronomical Data Analysis Software and Systems XVI, 2005.
[4] R. De los Reyes López, "Search for gamma-ray emission from pulsars with the MAGIC telescope: sensitivity studies, data check and data analysis", PhD thesis, Universidad Complutense de Madrid.
[5] D. Heck, J. Knapp, J.N. Capdevielle, G. Schatz, T. Thouw, "CORSIKA: A Monte Carlo Code to Simulate Extensive Air Showers".
[6] H. Kornmayer, M. Hardt, M. Kunze, C. Bigongiari, M. Mazzucato, A. de Angelis, G. Cabras, A. Forti, M. Frailis, M. Piraccini, M. Delfino, "A distributed, Grid-based analysis system for the MAGIC telescope".
[7] E. Huedo, R.S. Montero, I.M. Llorente, "The GridWay framework for adaptive scheduling and execution on grids", Scalable Computing: Practice and Experience.
[8] V. Lefébure, J. Andreeva, "RefDB: The Reference Database for CMS Monte Carlo Production", Computing in High Energy and Nuclear Physics (CHEP'03).

More information at: