Phase 2 of the Physics Data Challenge '04
Latchezar Betev
ALICE Offline week, Geneva, September 15, 2004


Slide 2: Outline
- Purpose and conditions of Phase 2
- Job structure and improvements to AliEn
- Statistics (up to today)
- Problems
- Toward Phase 3
- Conclusions

Slide 3: Phase 2 purpose and tasks
- Mixing of signal events with different physics content into the underlying Pb+Pb events (underlying events are reused several times)
- Tests of:
  - standard production of signal events
  - stress test of the network and file transfer tools
  - storage at remote SEs and its stability (crucial for Phase 3)
- Conditions and jobs:
  - 62 different conditions
  - 340K jobs, 15.2M events
  - 10 TB of produced data
  - 200 TB of data transferred from CERN
  - 500 MSI2K hours of CPU

Slide 4: Repartition of tasks (physics signals) [chart not transcribed]

Slide 5: Structure of event production in Phase 2 [diagram]
The central servers run the master job submission, the Job Optimizer (splitting each master job into N sub-jobs), the RB, the file catalogue, process monitoring and control, and the SE interfaces. Sub-jobs are dispatched both to CEs under direct AliEn control and, through the AliEn-LCG interface and the LCG RB, to LCG CEs. Each job reads its underlying-event input files from CERN CASTOR; its output files are zip-archived, with the primary copy stored at the local SE and a backup copy at CERN CASTOR. All outputs are registered in the AliEn file catalogue; for files on LCG SEs, the LCG LFN is set equal to the AliEn PFN, registered via edg(lcg) copy&register.

Slide 6: Jets master job JDL
- 12 input files from Phase 1
- 8 configuration files, job steering and validation scripts
- 4 output files (local SE), 1 backup zip copy (CERN CASTOR), 4 log files (CERN SE)
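For illustration, a minimal sketch of what such a master-job JDL could look like in AliEn's ClassAd-style syntax. Every LFN, SE name, and value below is invented, the file lists are abbreviated rather than showing all 12/8/4 entries, and the attribute names (Executable, Split, InputData, InputFile, Validationcommand, OutputFile, OutputArchive) follow general AliEn JDL conventions; the actual 2004 production JDL is not reproduced in the slides:

    Executable        = "jets_signal.sh";
    Split             = "file";
    InputData         = { "LF:/alice/pdc04/phase1/cent1/00001/galice.root" };
    InputFile         = { "LF:/alice/pdc04/config/Config.C",
                          "LF:/alice/pdc04/scripts/steer.sh",
                          "LF:/alice/pdc04/scripts/validate.sh" };
    Validationcommand = "validate.sh";
    OutputFile        = { "galice.root@Local::SE",
                          "AliESDs.root@Local::SE",
                          "stdout@Alice::CERN::SE",
                          "stderr@Alice::CERN::SE" };
    OutputArchive     = { "root_archive.zip:*.root@Alice::CERN::Castor" };

The Split attribute is what turns one master job into the N sub-jobs handled by the Job Optimizer, one per input data file.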

Slide 7: To make this possible
- AliEn system improvements:
  - AliEn process tables split into "running" (lightweight) and "done" (archive), allowing faster process tracking
  - Symbolic links and event groups implemented (through sophisticated search algorithms): the underlying events for a given signal event type are grouped via symbolic links in a dedicated directory. For example, 1660 underlying events are used for each jet signal condition, another 1660 for the next, and so on for all 12 conditions (a schematic sketch of the grouping follows below)
  - Zip archiving implemented, mainly to overcome the limitations of the tape systems (fewer, larger files)
  - Fast resubmission of failed jobs; in this phase all jobs must finish
  - New job monitoring tools, including single-job trace logs from start to finish, with logical steps and timing
- LCG improvements: see the talk of Piergiorgio Cerello
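A schematic illustration of the grouping idea referenced above. In PDC'04 the symbolic links lived in the AliEn file catalogue, not on a local filesystem, so the paths, directory layout, and POSIX-symlink rendering below are purely illustrative:

    import os

    # Group a pool of underlying events into per-condition directories of
    # symbolic links: 1660 events per jet signal condition, 12 conditions.
    # All paths are hypothetical.
    EVENTS_PER_CONDITION = 1660
    UNDERLYING = "/data/underlying"
    GROUPS = "/data/signal_groups"

    events = sorted(os.listdir(UNDERLYING))
    for i in range(12):  # 12 jet signal conditions
        group_dir = os.path.join(GROUPS, f"jets_cond_{i:02d}")
        os.makedirs(group_dir, exist_ok=True)
        # Take the next block of 1660 events for this condition.
        block = events[i * EVENTS_PER_CONDITION:(i + 1) * EVENTS_PER_CONDITION]
        for ev in block:
            link = os.path.join(group_dir, ev)
            if not os.path.lexists(link):
                os.symlink(os.path.join(UNDERLYING, ev), link)

The point of the scheme is that no event data is duplicated: each signal condition simply sees its own directory of links into the shared pool of underlying events.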

Slide 8: Phase 2 statistics (start: July 2004, end: September 2004)
- Jet signals, unquenched and quenched, cent1: 90% complete
- Jet signals, unquenched, per1: 60% complete
- Special TRD production at CNAF: phase 1 running
- Number of jobs: 75K (the number of jobs completed per day is accelerating)
- Number of output files: 375K data, 300K log
- Data volume: 3.2 TB at local SEs, 3.2 TB at CERN (backup)
- Job duration: 2 h 30 min (cent1), 1 h 20 min (per1); careful profiling of AliRoot and clean-up of the code has reduced the processing time by a factor of 2!

Slide 9: Individual sites, CPU contribution [chart not transcribed]
Under direct AliEn control: 17 CEs, each with an SE; CERN-LCG encompasses the LCG resources worldwide (also with local/close SEs).

Slide 10: Individual sites, jobs successfully done [chart not transcribed]

Slide 11: Current problems
- AliEn problems:
  - Proxy server: runs out of memory due to a spiralling number of proxy connections. An attempt to introduce a schema with a pre-forked, limited number of proxies was not successful, and the problem has to be studied further (a generic sketch of the pre-fork technique follows below). Not a show-stopper: we know what to monitor and how to avoid it
  - JobOptimizer: because of the very complex structure of the jobs (many files in the input box), the time needed to prepare one job for submission is large, and the service sometimes cannot supply enough jobs to fill the available resources. Not a show-stopper now: we are mixing jobs of different execution lengths, thus load-balancing the system. It has to be fixed for Phase 3, where the input boxes of the jobs will be even larger and the processing time very short; clever ideas for speeding up the system already exist
- LCG problems: see the talk of Piergiorgio Cerello
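For context, the pre-fork schema mentioned above, sketched generically: a fixed pool of worker processes shares one listening socket, which bounds the number of proxy processes (and hence memory) no matter how many clients connect. This only illustrates the general technique; the internals of the AliEn proxy service are not described in the slides, and the port, pool size, and echo "proxy logic" are placeholders:

    import os
    import socket

    NUM_WORKERS = 8        # hard cap on proxy processes (placeholder value)
    ADDRESS = ("", 9000)   # placeholder port

    def handle(conn):
        # Placeholder for real proxy logic (forwarding client requests to
        # the central services); here we just echo the request back.
        data = conn.recv(4096)
        if data:
            conn.sendall(data)
        conn.close()

    def worker(listener):
        # Each pre-forked child blocks in accept(); the kernel distributes
        # incoming connections across the fixed pool of workers.
        while True:
            conn, _ = listener.accept()
            handle(conn)

    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(ADDRESS)
    listener.listen(128)
    for _ in range(NUM_WORKERS):
        if os.fork() == 0:      # child: one member of the proxy pool
            worker(listener)
            os._exit(0)
    for _ in range(NUM_WORKERS):
        os.wait()               # parent blocks on the pool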

Slide 12: Toward Phase 3
- Purpose: distributed analysis of the data processed in Phase 2
- An AliEn analysis prototype already exists: some poor souls are trying to work with it, but it is difficult with the production running…
- We want to use gLite during this phase as much as possible (and provide feedback)
- Service requirements:
  - In both Phase 1 and Phase 2 the service quality of the computing centres has been excellent, with very short response times in case of problems
  - Phase 3 will continue until the end of the year, and the remote computing centres will have to keep providing the same excellent level of service
  - Since the data are stored locally, interruptions of service will cause the analysis jobs to fail (or run very slowly); the backup copy at CERN is on tape only and would take a considerable amount of time to stage back if the local copy is not accessible
  - The above holds both for the centres directly controlled through AliEn and for the LCG sites

Slide 13: Conclusions
- Phase 2 of PDC'04 is about 50% finished and is progressing well, despite its complexity
- There is keen competition for resources at all sites (LHCb and ATLAS are also running massive DCs)
- We have not encountered any show-stoppers; all production problems that arise are fixed by the AliEn crew very quickly, and the response of the experts at the computing centres is very efficient
- We are also running a considerable number of jobs on LCG sites, which are performing very well, with more and more resources being made available for ALICE (see the talk of Piergiorgio Cerello), thanks to the hard work of the LCG team
- In about three weeks we will seamlessly enter the last phase of PDC'04…
- It is not over yet, but we are getting close!

Slide 14: Acknowledgements
Special thanks to the site experts for the computing and storage resources and for the excellent support:
- Francesco Minafra – Bari
- Haavard Helstrup – Bergen
- Roberto Barbera – Catania
- Giuseppe Lo Re – CNAF Bologna
- Kilian Schwarz – FZK Karlsruhe
- Jason Holland – TLC² Houston
- Galina Shabratova – IHEP, ITEP, JINR
- Eygene Ryabinkin – KIAE Moscow
- Doug Olson – LBL
- Yves Schutz – CC-IN2P3 Lyon
- Doug Johnson – OSC Ohio
- Jiri Chudoba – Golias Prague
- Andrey Zarochencev – SPbSU St. Petersburg
- Jean-Michel Barbet – SUBATECH Nantes
- Mario Sitta – Torino
And to Patricia Lorenzo for bearing with us…