Jiri Chudoba for the Pierre Auger Collaboration, Institute of Physics of the CAS and CESNET

Properties of Ultra High Energy Cosmic Rays:
- Sources
- Energy spectrum
- Composition
- Cause of the flux suppression
- Measurement of cosmic ray showers

Surface detector (SD):
- water Cherenkov detectors covering 3000 km², spacing 1500 m (E > … eV)
- infill array with 750 m spacing (E > … eV)
Fluorescence detector (FD):
- 4×6 telescopes + 3 HEAT telescopes on 4 sites
Radio detection techniques:
- AERA stations (17 km²)
Auger Upgrade:
- installation of additional scintillation detectors

More than 500 members from 16 countries (full and associate): Argentina, Australia, Brazil, Colombia*, Czech Republic, France, Germany, Italy, Mexico, Netherlands, Poland, Portugal, Romania, Slovenia, Spain, USA (*associated)

Real data: processing and storage at CC IN2P3 (plus mirror at FERMILAB)
Monte Carlo simulations on available resources:
- libraries of showers with different parameters (models, energy, angle, primary particle, ...)
- mostly CORSIKA plus Auger Offline
- several CPU hours per shower
- resources: local farms, grid sites since 2007

Evolution from a set of scripts around lcg middleware to a production system with a MySQL backend
- Django dashboard, PHP scripts, SimDB
- a lot of effort over several years; continuous maintenance required

4000 cores, > … jobs/month

Propagates users to VOMS servers
Also creates local accounts on UIs
We have a username – how to directly register users to DIRAC?

WMS, LFC not used by LHC experiments
Welcomed DIRAC features:
- pilot jobs: better usage of some sites
- monitoring and accounting: better control over individual users' productions
- "free" support: DIRAC instance managed by an external team, continuous development
See other talks about DIRAC at CHEP15 for more DIRAC features

Parametric jobs:
- an integer parameter defines a line in a file with parameters (random number seed, run nr.)
- … jobs submitted in 150 s
- easy for standard CORSIKA production; other workflows are more difficult (see the sketch below)
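To make the parametric-submission idea concrete, here is a minimal sketch using the DIRAC Python job API. It assumes a configured DIRAC client; the wrapper script name run_corsika.sh and the job count of 1000 are hypothetical, and the exact API may differ between DIRAC versions.

```python
# Hedged sketch of a DIRAC parametric job submission (details vary by DIRAC
# version). One Job object expands server-side into N jobs, one per value of
# the parameter sequence.
from DIRAC.Core.Base.Script import Script
Script.parseCommandLine()  # initialise the DIRAC client environment first

from DIRAC.Interfaces.API.Dirac import Dirac
from DIRAC.Interfaces.API.Job import Job

job = Job()
job.setName("corsika_run")
# One parameter per job: the line number in a parameter file, where each
# line carries e.g. a random number seed and a run number.
job.setParameterSequence("lineno", [str(i) for i in range(1, 1001)])
# run_corsika.sh is a hypothetical wrapper that reads the given line of the
# parameter file and launches CORSIKA with those values.
job.setExecutable("run_corsika.sh", arguments="%(lineno)s")

result = Dirac().submitJob(job)  # a single call submits all 1000 jobs
print(result)
```

This matches the slide's numbers: one submission call covers the whole batch, which is how on the order of a thousand jobs can be submitted in about 150 s.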

Decision:
- tested on the full catalog (36 million files); performance verified on test instances
- no bulk migration: new productions go to the DFC, LFC kept running for the needed period
- use the DFC as a Metadata Catalog too

First tests with OSG, but different paths for EU and USA sites
EGI CVMFS Task Force led by Catalin Condurache
- all Auger EGI sites now have /cvmfs/auger.egi.eu (… months to complete)
Experiments with end users' desktops and local clusters:
- attention to correct cache setup (client config sketch below)
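For a desktop or local cluster node, the client-side setup is a short configuration file; a minimal sketch is below. The proxy URL is a placeholder, and the cache size must be tuned to the node (the "correct cache setup" above), since an undersized cache causes constant re-fetching.

```
# /etc/cvmfs/default.local — minimal client config sketch (values illustrative)
CVMFS_REPOSITORIES=auger.egi.eu
CVMFS_HTTP_PROXY="http://squid.example.org:3128"   # placeholder site proxy
CVMFS_QUOTA_LIMIT=20000                            # local cache size in MB
```

After reloading the configuration, `cvmfs_config probe auger.egi.eu` should confirm that the repository mounts.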

Datasets implementation:
- dataset = set of files matching a metaquery (sketch below)
- static and dynamic datasets
- cached informational data (number of files, size, ...)
- methods for adding, removing, status query, releasing, freezing, replication, distribution over SEs
- implementation done by M. Adam (NPI CAS, CZ)
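To illustrate the metaquery that a dataset is defined on, here is a hedged sketch of a metadata query against the DIRAC File Catalog with the Python client API. The metadata field names and the /auger path are illustrative, not the VO's actual schema, and the dataset-specific methods added by M. Adam are not shown.

```python
# Hedged sketch: a metaquery against the DIRAC File Catalog, the selection
# mechanism that datasets are built on. Field names and path are examples.
from DIRAC.Core.Base.Script import Script
Script.parseCommandLine()

from DIRAC.Resources.Catalog.FileCatalogClient import FileCatalogClient

fc = FileCatalogClient()
metaquery = {"Application": "CORSIKA", "Primary": "proton"}  # illustrative fields
result = fc.findFilesByMetadata(metaquery, path="/auger")    # search under a VO path
if result["OK"]:
    print("dataset would contain %d files" % len(result["Value"]))
else:
    print("query failed: %s" % result["Message"])
```

A dynamic dataset would re-evaluate such a query on access, while a frozen one pins the file list; the cached counts and sizes listed above avoid re-running the query for bookkeeping.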

- Use the DFC for new productions
- Use the DIRAC instance at CC IN2P3 for job management
- Update the current framework with calls to DIRAC
- Investigate data management tools: deletion, transfer services, consistency checks
- Define and use metadata
- Evaluate dataset properties
- Evaluate web portal usage for users

- Usage of clouds
- Usage of short-term manpower (students)
- Exchange of solutions with other projects