Slide 1: A distributed, Grid-based analysis system for the MAGIC telescope
CHEP 2004, Interlaken
H. Kornmayer, IWR, Forschungszentrum Karlsruhe; C. Bigongiari, INFN Padua; A. de Angelis, M. Paraccini, A. Forti, University of Udine; M. Delfino, PIC Barcelona; M. Mazzucato, CNAF Bologna
For the MAGIC Collaboration
Forschungszentrum Karlsruhe in der Helmholtz-Gemeinschaft

Slide 2: Outline
- What kind of MAGIC? The telescope
- Requirements
- The basic architecture of the distributed system
- First results
- The future

Slide 3: MAGIC - introduction
MAGIC Telescope:
- ground-based Air Cherenkov Telescope
- La Palma, Canary Islands (28° North, 18° West)
- 17 m mirror diameter
- in operation since autumn 2003 (still in commissioning)
Collaborators: IFAE Barcelona, UAB Barcelona, Humboldt U. Berlin, UC Davis, U. Lodz, UC Madrid, MPI München, INFN / U. Padova, U. Potchefstroom, INFN / U. Siena, Tuorla Observatory, INFN / U. Udine, U. Würzburg, Yerevan Physics Inst., ETH Zürich

Slide 4: MAGIC - physics goals
- Active Galactic Nuclei (AGN), e.g. Mkn 501, Mkn 421
- Supernova Remnants
- Unidentified EGRET sources
- Gamma Ray Bursts
- etc.
(Image: the Crab nebula observed by Chandra)

Slide 5: MAGIC - ground-based γ-ray astronomy
1. The primary particle creates an extensive air shower.
2. The secondary particles produce Cherenkov light.
3. The telescope collects the Cherenkov light onto the focal plane.
4. The camera and DAQ system record the shower images.
The gamma-ray flux is low, so a huge collection area is required; only ground-based observations are possible. The cosmic rays consist mainly of hadronic primaries, so a gamma/hadron separation based on Monte Carlo simulations is needed (see the sketch below).
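To illustrate the idea of such a separation, the following minimal sketch applies simple cuts on Hillas-style image parameters. The parameter names follow the usual parametrisation, but the cut values and event data are invented for illustration and are not the MAGIC analysis.

```python
# Illustrative sketch only: separate gamma-like from hadron-like events
# with simple cuts on image parameters. The cut values below are
# invented for illustration and are NOT the MAGIC cuts; in a real
# analysis they are optimised on CORSIKA Monte Carlo samples of gamma
# showers and hadronic background.

from dataclasses import dataclass

@dataclass
class ShowerImage:
    width: float   # image width in degrees
    length: float  # image length in degrees
    alpha: float   # orientation angle w.r.t. source position in degrees
    size: float    # total light content in photoelectrons

def is_gamma_like(img: ShowerImage) -> bool:
    """Very rough gamma/hadron separation: gamma images are narrow,
    short and point back to the source (small alpha)."""
    return (img.width < 0.12 and
            img.length < 0.30 and
            abs(img.alpha) < 10.0 and
            img.size > 100.0)

# Two invented example events: one gamma-like, one hadron-like.
events = [ShowerImage(0.08, 0.22, 4.0, 350.0),
          ShowerImage(0.25, 0.45, 40.0, 900.0)]
print([is_gamma_like(e) for e in events])   # -> [True, False]
```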

Slide 6: MAGIC - the camera
- 577 photomultiplier tubes, ~3.5° field of view
- read out with a 300 MHz FADC system
- readout is based on multi-level triggers
(Images: example camera images of gamma, proton and muon events)

Slide 7: MAGIC - Monte Carlo simulations
- Based on the air shower simulation program CORSIKA
- Simulation of the hadronic background is very CPU consuming:
  - to simulate the background of one night, 70 CPUs (P4, 2 GHz) need to run for days
  - to simulate the gamma events of one night for a Crab-like source takes 288 days
  - the detector and the atmosphere change over time
- A good production strategy is needed (see the sketch below).
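A minimal sketch of such a production strategy, assuming the simulation of one night is split into many independent CORSIKA runs that can be scheduled as separate Grid jobs. The steering-card keywords are the standard CORSIKA ones, but the concrete values, file names and job granularity are illustrative only.

```python
# Minimal sketch: split one night's hadronic background simulation into
# many independent CORSIKA runs, each with its own run number and random
# seeds, so the runs can be submitted as separate Grid jobs.
# The values below (shower counts, energy range, ...) are illustrative.

from pathlib import Path

def write_corsika_card(run_number: int, n_showers: int, outdir: Path) -> Path:
    """Write one CORSIKA steering card for an independent production run."""
    card = outdir / f"corsika_{run_number:06d}.inp"
    card.write_text("\n".join([
        f"RUNNR   {run_number}",
        f"NSHOW   {n_showers}",
        f"SEED    {run_number * 2 + 1} 0 0",
        f"SEED    {run_number * 2 + 2} 0 0",
        "PRMPAR  14",            # protons as hadronic primaries
        "ERANGE  30. 30000.",    # energy range in GeV (illustrative)
        "EXIT",
    ]) + "\n")
    return card

# Split e.g. 1 000 000 background showers into 500 jobs of 2000 showers.
outdir = Path("mc_cards")
outdir.mkdir(exist_ok=True)
cards = [write_corsika_card(run, 2000, outdir) for run in range(1, 501)]
print(f"prepared {len(cards)} steering cards")
```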

Slide 8: MAGIC - summary
- European collaboration
- In operation for one year, still in the commissioning phase
- Parameters:
  - field of view: 3.5°
  - current trigger threshold: ~50 GeV
  - current trigger rate: ~200 Hz (typical)
  - current data volume per hour: ~ GB
- Intensive Monte Carlo studies are needed to lower the threshold
- A second telescope is already funded

Slide 9: MAGIC - summary (continued)
The same summary as on slide 8, now with the first observed signals shown:
(Plots: Crab (March) and Mkn 421 (April 2004))

Slide 10: Requirements
Storage / amount of data:
- real data: > 7 TB per year
- MC data: > 10 TB per year
- easy access to the data
Computation:
- MC: 100 CPUs today, need for > 1000 CPUs
- analysis: 40 CPUs today, need for > 200 CPUs and new methods in the future
General:
- distributed: collaborators all over Europe
- secure: restricted access
- accessibility: easy access via a portal, easy access to the data
- availability: 24x7
- scalability: MAGIC II is coming; collaboration with other air Cherenkov telescopes

Slide 11: Basic Architecture
MAGIC backbone: three computing centres, CNAF (Italy), PIC (Spain) and FZK (Germany), plus the La Palma site, run the main services:
- data storage
- job scheduling
- portal
Collaborators "plug in" their resources to the backbone.

Slide 12: Basic architecture II
A metadata catalogue is needed to classify the observation data.
Usage of existing components:
- based on LCG-2: CE/SE/UI, Resource Broker, Replica Location Service
- CrossGrid portal solution: Migrating Desktop, Roaming Access Server, JSS (Job Submission Service)
Metadata catalogues:
- select real data by astronomical source
- select MC data by input parameters (see the sketch below)
Implementation of the services will start with the developments for Monte Carlo production.
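As a minimal sketch of what such a metadata catalogue could look like, the snippet below uses an in-memory SQLite database; the table layout, column names and example entries are invented for illustration, since the slides do not specify the actual schema.

```python
# Minimal sketch of the metadata catalogue idea, using SQLite for
# illustration. Table and column names are invented; they are not the
# actual MAGIC catalogue schema.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE observation_runs (
    run_id     INTEGER PRIMARY KEY,
    source     TEXT,        -- astronomical source, e.g. 'Crab', 'Mkn421'
    start_time TEXT,
    file_lfn   TEXT         -- logical file name in the replica catalogue
);
CREATE TABLE mc_runs (
    run_id       INTEGER PRIMARY KEY,
    primary_type TEXT,      -- 'gamma' or 'proton'
    energy_min   REAL,      -- GeV
    energy_max   REAL,      -- GeV
    zenith_angle REAL,      -- degrees
    file_lfn     TEXT
);
""")

conn.execute("INSERT INTO observation_runs VALUES "
             "(1, 'Crab', '2004-03-10', 'lfn:/magic/obs/run000001')")
conn.execute("INSERT INTO mc_runs VALUES "
             "(1, 'proton', 30.0, 30000.0, 20.0, 'lfn:/magic/mc/run000001')")

# Select real data by astronomical source ...
crab_files = conn.execute(
    "SELECT file_lfn FROM observation_runs WHERE source = ?", ("Crab",)).fetchall()

# ... and MC data by input parameters.
mc_files = conn.execute(
    "SELECT file_lfn FROM mc_runs WHERE primary_type = ? AND zenith_angle < ?",
    ("proton", 30.0)).fetchall()

print(crab_files, mc_files)
```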

Slide 13: MAGIC Grid Simulation Tool
Run and control simulations in the distributed system:
- driven by three use cases: submit jobs, monitor jobs, manage data
- easy-to-use GUI (Java Swing) that hides the LCG commands
- job monitoring and data management based on a dedicated database (see the sketch below)
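A minimal sketch of hiding the LCG commands behind a wrapper and tracking jobs in a run database. It assumes the LCG-2 era command line tools edg-job-submit and edg-job-status are available; the database schema, file names and status values are invented for illustration and are not the actual MAGIC run database. The resubmission helper at the end anticipates the point made on the next slide.

```python
# Sketch only: wrap the LCG job submission commands and keep track of
# every run in a small database, so jobs can be monitored and failed
# runs resubmitted. Schema, file names and statuses are illustrative.

import sqlite3
import subprocess

DB = sqlite3.connect("magic_runs.db")
DB.execute("""CREATE TABLE IF NOT EXISTS jobs (
                  run_number INTEGER PRIMARY KEY,
                  jdl_file   TEXT,
                  grid_id    TEXT,
                  status     TEXT)""")

def submit_run(run_number: int, jdl_file: str) -> str:
    """Submit one MC run to the Grid and record it in the run database."""
    out = subprocess.run(["edg-job-submit", jdl_file],
                         capture_output=True, text=True, check=True).stdout
    # The submit command prints the assigned job identifier (an https:// URL).
    grid_id = next(line.strip() for line in out.splitlines()
                   if line.strip().startswith("https://"))
    DB.execute("INSERT INTO jobs VALUES (?, ?, ?, 'SUBMITTED')",
               (run_number, jdl_file, grid_id))
    DB.commit()
    return grid_id

def update_status(run_number: int) -> None:
    """Query the Grid for the job status and update the database."""
    (grid_id,) = DB.execute("SELECT grid_id FROM jobs WHERE run_number = ?",
                            (run_number,)).fetchone()
    out = subprocess.run(["edg-job-status", grid_id],
                         capture_output=True, text=True).stdout
    status = ("DONE" if "Done" in out
              else "FAILED" if "Aborted" in out
              else "RUNNING")
    DB.execute("UPDATE jobs SET status = ? WHERE run_number = ?",
               (status, run_number))
    DB.commit()

def resubmit_failed() -> int:
    """Resubmit every run recorded as FAILED in the run database."""
    failed = DB.execute(
        "SELECT run_number, jdl_file FROM jobs WHERE status = 'FAILED'").fetchall()
    for run_number, jdl_file in failed:
        DB.execute("DELETE FROM jobs WHERE run_number = ?", (run_number,))
        submit_run(run_number, jdl_file)
    return len(failed)
```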

Slide 14: CrossGrid
- Used for the first implementation steps
- LCG-2 testbed with extensions from CrossGrid, 16 sites in Europe
- First tests done: 200 jobs submitted, 10% failed due to problems with the setup (replica location service)
- Easy resubmission thanks to the MAGIC run database (see the resubmission helper in the slide 13 sketch)

Slide 15: MAGIC and EGEE
- EGEE is the biggest Grid project in Europe
- MAGIC applied as a generic application (NA4); the proposal was accepted in June/July and the collaboration with EGEE started in August
- First results:
  - Virtual Organisation VO-MAGIC set up at NIKHEF/SARA
  - usage of the GILDA testbed (Catania) agreed
  - integration of the first site of MAGIC collaborators planned for the end of the year

Slide 16: Conclusion & the Future
The MAGIC Grid:
- is a good example of a distributed simulation and analysis system
- can be used to exploit the existing Grid middleware
- is on the way, and first results can be seen
- is a prototype for astroparticle physics
- can help to build collaborations with other experiments (air Cherenkov telescopes, satellites, optical telescopes, ...)