The ALICE Computing
F. Carminati, May 4, 2006, Madrid, Spain



The ALICE detector and data rates
– Trigger/DAQ chain:
  – Level 0 (special hardware): 8 kHz (160 GB/s)
  – Level 1 (embedded processors): 200 Hz (4 GB/s)
  – Level 2 (PCs): 30 Hz (2.5 GB/s), reduced to 30 Hz (1.25 GB/s) for data recording & offline analysis
– Detector: total weight 10,000 t, overall diameter 16.00 m, overall length 25 m, magnetic field 0.4 T
– ALICE Collaboration: ~1/2 the size of ATLAS or CMS, ~2x LHCb; ~1000 people, 30 countries, ~80 institutes
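A quick cross-check of these numbers (our arithmetic, assuming the quoted bandwidths are averages): 160 GB/s at 8 kHz and 4 GB/s at 200 Hz both correspond to ~20 MB per event; at Level 2, 2.5 GB/s at 30 Hz is ~83 MB per event (presumably the selected central collisions are the large ones), reduced to ~42 MB per event (1.25 GB/s at 30 Hz) before recording, i.e. roughly a factor two.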

The history
– Developed since 1998 along a coherent line
– Developed in close collaboration with the ROOT team
– No separate physics and computing teams
  – Minimises communication problems
  – May lead to "double counting" of people
– Used for the TDRs of all detectors and for the Computing TDR simulations and reconstructions

The framework
– Base: ROOT, with AliEn + LCG providing the Grid services
– AliRoot on top, with the STEER core: AliSimulation, AliReconstruction, AliAnalysis, the ESD classes and the Virtual MC
– Transport engines behind the Virtual MC: G3, G4, FLUKA
– Event generators: PYTHIA6, HIJING, MEVSIM, ISAJET, PDF, EVGEN, HBTP, HBTAN, JETAN
– Detector modules: ITS, TPC, TRD, TOF, PHOS, EMCAL, RICH, MUON, ZDC, PMD, FMD, START, CRT, ALICE STRUCT
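In practice a full simulation or reconstruction cycle is steered from a short ROOT macro. A minimal sketch in the style of the AliRoot tutorials; AliSimulation and AliReconstruction are the real STEER entry points, but the setup shown is illustrative, not a validated configuration:

  // runCycle.C - illustrative AliRoot steering macro
  void runCycle(Int_t nEvents = 1)
  {
    AliSimulation sim;     // generator and transport MC are picked in Config.C
    sim.Run(nEvents);      // generation, transport, hits -> digits

    AliReconstruction rec; // local reconstruction, tracking, PID
    rec.Run();             // writes the ESD (AliESDs.root)
  }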

The code
– 0.5 MLOC of C++, 0.5 MLOC of "vintage" FORTRAN
– Nightly builds
– Strict coding conventions
– Subset of C++ (no templates, STL or exceptions!)
  – "Simple" C++, fast compilation and linking (see R. Brun's talk)
– No configuration-management tools (only CVS); aliroot is a single package to install
– Maintained on several systems: DEC-Tru64, Mac OS X, Linux RH/SLC/Fedora (i32, i64, AMD), Sun Solaris
– 30% developed at CERN, 70% outside

The tools
– Coding convention checker
– Reverse engineering
– Smell detection
– Branch instrumentation
– Genetic testing (in preparation)
– Aspect-Oriented Programming (in preparation)

The simulation
– User code talks only to the VMC (Virtual Monte Carlo) and to the geometrical modeller
– The VMC dispatches transport to G3, G4 or FLUKA without changes to the user code
– The same geometry serves transport, reconstruction and visualisation; the generators plug in on the same footing
See A. Morsch's talk.
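The engine is chosen at run time when the concrete VMC is instantiated. A sketch of the relevant Config.C fragment; TGeant3TGeo is the real Geant3 VMC class, while the constructor argument and the alternatives in the comment are indicative:

  // Config.C fragment: pick the transport engine behind the VMC
  void loadTransport()
  {
    new TGeant3TGeo("C++ Interface to Geant3");  // sets the global gMC
    // or: new TGeant4(...), new TFluka(...) - same TVirtualMC interface,
    // so the detector stepping code is untouched when the engine changes
  }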

TGeo modeller
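A flavour of the TGeo API, with a toy geometry invented for the example (the real ALICE geometry is of course built by the detector modules):

  // Toy TGeo geometry: a vacuum box containing a tube
  void simpleGeom()
  {
    new TGeoManager("demo", "TGeo modeller sketch");
    TGeoMaterial *mat = new TGeoMaterial("Vacuum", 0, 0, 0);
    TGeoMedium *vac = new TGeoMedium("Vacuum", 1, mat);

    TGeoVolume *top = gGeoManager->MakeBox("TOP", vac, 100., 100., 100.);
    gGeoManager->SetTopVolume(top);
    top->AddNode(gGeoManager->MakeTube("TUBE", vac, 10., 20., 50.), 1);

    gGeoManager->CloseGeometry();  // builds the optimised navigation structures
    top->Draw();
  }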

The reconstruction
– Incremental process
  – Forward propagation towards the vertex: TPC → ITS
  – Back propagation: ITS → TPC → TRD → TOF
  – Refit inward: TOF → TRD → TPC → ITS
– Continuous seeding: track-segment finding in all detectors
– Conflicts between candidates sharing clusters across ITS/TPC/TRD/TOF are resolved in favour of the best track
– Combinatorial tracking in the ITS
  – Weighted two-track χ² calculated
  – Effective probability of cluster sharing
  – Probability for secondary particles not to cross a given layer
See P. Hristov's talk.
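The "weighted two-track χ²" is the usual compatibility test between two estimates of the same track parameters. A self-contained 1-D toy (all numbers invented) to show the idea:

  // Toy weighted chi2 between two estimates of one track parameter
  #include <cstdio>

  int main()
  {
    double x1 = 1.00, var1 = 0.04;  // e.g. TPC extrapolation and its variance
    double x2 = 1.15, var2 = 0.09;  // e.g. ITS cluster and its variance
    double d = x1 - x2;
    double chi2 = d * d / (var1 + var2);  // difference weighted by the combined uncertainty
    std::printf("chi2 = %.3f (candidates compatible if below a cut)\n", chi2);
    return 0;
  }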

Calibration
– Inputs: DAQ, Trigger, DCS, ECS, HLT and physics data feed the calibration procedures through the shuttle
– Calibration files and their metadata go to the DCDB file store on AliEn + LCG
– AliRoot calibration classes read them back through an API (Application Program Interface)
– From the User Requirements: source, volume, granularity, update frequency, access pattern, runtime environment and dependencies
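From the client side the API looks like the AliRoot condition-DB access classes. A sketch; AliCDBManager and AliCDBEntry are the real class names, but the storage URI and the calibration path are invented examples:

  // Read a calibration object for a given run (illustrative)
  void readCalib(Int_t run)
  {
    AliCDBManager *man = AliCDBManager::Instance();
    man->SetDefaultStorage("alien://folder=/alice/data/CDB");  // Grid file store
    man->SetRun(run);

    AliCDBEntry *entry = man->Get("TPC/Calib/Pedestals");  // hypothetical path
    if (entry) entry->GetObject()->Print();  // detector-specific calibration object
  }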

Alignment
– Simulation: the ideal geometry plus a misalignment produces the raw data
– Reconstruction: the ideal geometry plus alignment objects, obtained from the survey files and from the alignment procedure
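At the TGeo level a misalignment is just a new matrix applied to a physical node. A minimal sketch reusing the toy geometry above (path and shift invented; in AliRoot this is wrapped by dedicated alignment-object classes):

  // Misalign one volume of an already-built TGeo geometry
  void misalign()
  {
    TGeoPhysicalNode *pn = gGeoManager->MakePhysicalNode("/TOP_1/TUBE_1");
    pn->Align(new TGeoTranslation(0.1, 0.0, 0.0));  // shift by 1 mm in x
  }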

Tag architecture
– The reconstruction writes per-event tags: ev#/guid plus tag1, tag2, tag3...
– An index builder turns the tags into a bitmap index
– An analysis job runs its selection against the index and receives the list of matching ev#/guid's
– The list is distributed to the PROOF workers (proof#1 ... proof#n) as guid → {ev1 ... evn}
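The same selection pattern in plain ROOT, for flavour: run a cut over a tree of event tags and keep only the matching entries (file, tree and tag names invented):

  // Select events from a tag tree and reuse the entry list in the analysis
  void selectEvents()
  {
    TFile *f = TFile::Open("tags.root");
    TTree *tags = (TTree*)f->Get("T");

    tags->Draw(">>elist", "multiplicity > 8000 && nMuons >= 2", "entrylist");
    TEntryList *elist = (TEntryList*)gDirectory->Get("elist");

    tags->SetEntryList(elist);  // the analysis pass now reads only these events
    printf("selected %lld events\n", elist->GetN());
  }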

Visualisation: see M. Tadel's talk.

ALICE analysis: basic concepts
– Analysis models
  – Prompt reco/analysis at T0 using the PROOF infrastructure
  – Batch analysis using the Grid infrastructure
  – Interactive analysis using the PROOF (+ Grid) infrastructure
– User interface
  – ALICE users access any Grid infrastructure via the AliEn or ROOT/PROOF UIs
– AliEn
  – Native and "Grid on a Grid" (LCG/EGEE, ARC, OSG)
  – Integrates common components as much as possible: LFC, FTS, WMS, MonALISA...
– PROOF/ROOT
  – Single- and multi-tier, static and dynamic PROOF clusters
  – Grid API class: TGrid (virtual) → TAliEn (concrete)
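The TGrid abstraction in a nutshell: connecting instantiates the concrete TAliEn behind the virtual interface. A sketch using the standard ROOT calls (the catalogue path queried is an invented example):

  // Connect to AliEn through the virtual TGrid interface
  void gridConnect()
  {
    TGrid::Connect("alien://");  // sets the global gGrid (a TAliEn underneath)
    if (!gGrid) return;

    TGridResult *res = gGrid->Query("/alice/sim/2006", "*.root");  // hypothetical path
    if (res) res->Print();
  }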

If you thought this was difficult... NA49 experiment: a Pb-Pb event.

ALICE Pb-Pb central event, Nch(-0.5 < η < 0.5) = 8000 ... then what about this!

ALICE Collaboration: ~1000 members (63% from CERN member states), ~30 countries, ~80 institutes.

CERN computing power
– "High-throughput" computing based on reliable commercial components
– More than 1500 dual-CPU PCs; 5000 in 2007
– More than 3 PB of data on disks and tapes; >15 PB in 2007
– Far from enough!

EGEE production service
– >180 sites
– > CPUs
– ~ jobs completed per day
– 20 VOs
– >800 registered users, representing thousands of scientists
Situation as of 20 September 2005.

ALICE view of the current situation: the evolution of the middleware stack
– Then: experiment-specific services / AliEn / EDG
– Now: experiment-specific services / AliEn architecture + LCG code / LCG → EGEE
– Ahead: experiment-specific services (AliEn' for ALICE) / EGEE, ARC, OSG...

How a job runs on the ALICE Grid
– The user submits a job; it enters the ALICE Job Catalogue together with its input data, e.g. Job 1.1: lfn1; Job 1.2: lfn2; Job 1.3: lfn3, lfn4; Job 2.1: lfn1, lfn3; ...
– The ALICE File Catalogue maps each lfn to a guid and to the SEs holding a copy (lfn → guid → {se's})
– The Optimizer prepares the workload; the Computing Agent on the site VO-Box requests work and submits agents through the LCG RB to the CE; the agent executing on the WN pulls and runs the user job
– The VO-Box also hosts packman (package installation), the storage adapter (SA) and xrootd; file access resolves GUID → LFC → SRM (SURL) → MSS
– The job registers its output back in the ALICE catalogues

Distributed analysis processing
– The user job specifies many events through a data set (ESDs, AODs); a File Catalogue query resolves the input files
– The Job Optimizer splits it into sub-jobs 1..n, grouped by the SE location of their files
– The Job Broker submits each sub-job to the CE with the closest SE, where it is processed against the local SE
– Each sub-job writes an output file; a final file-merging job assembles them into the job output
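On the submission side this split is requested declaratively. A hedged sketch of an AliEn-style JDL and its submission from ROOT (field values, paths and the script name are invented; "Split = se" is the AliEn way of asking for one sub-job per SE file group):

  // Submit a split analysis job through the TGrid interface.
  // analysis.jdl (registered in the AliEn catalogue) could read:
  //   Executable = "analysis.sh";
  //   InputData  = "LF:/alice/sim/2006/run123/*/AliESDs.root";
  //   Split      = "se";
  //   OutputFile = "histos.root";
  void submitAnalysis()
  {
    TGrid::Connect("alien://");
    if (gGrid) gGrid->Submit("analysis.jdl");
  }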

Data Challenge
– The last (!) exercise before data taking
– A test of the whole system, started with simulation
– Up to 3600 jobs running in parallel
– Next come reconstruction and analysis

ALICE computing model
– For pp, similar to the other experiments
  – Quasi-online data distribution and first reconstruction at T0
  – Further reconstructions at the T1s
– For AA, a different model
  – Calibration, alignment, pilot reconstructions and partial data export during data taking
  – Data distribution and first reconstruction at T0 in the four months after the AA run (shutdown)
  – Further reconstructions at the T1s
– T0: first-pass reconstruction; storage of RAW, calibration data and first-pass ESDs
– T1: subsequent reconstructions and scheduled analysis; storage of a collective copy of RAW and of one copy of data to be safely kept; disk replicas of ESDs and AODs
– T2: simulation and end-user analysis; disk replicas of ESDs and AODs

Offline project organisation
– Offline Coordination (Deputy PL): resource planning, relations with funding agencies and with the C-RRB
– Production Environment Coordination: production environment (simulation, reconstruction & analysis), distributed computing environment, database organisation
– Framework & Infrastructure Coordination: framework development (simulation, reconstruction & analysis), persistency technology, computing data challenges, industrial joint projects, technology tracking, documentation
– Simulation Coordination: detector simulation, physics simulation, physics validation, GEANT4 integration, FLUKA integration, radiation studies, geometrical modeller
– Reconstruction & Physics Software Coordination: tracking, detector reconstruction, global reconstruction, analysis tools, analysis algorithms, physics data challenges, calibration & alignment algorithms
– Interfaces: detector projects, HLT, DAQ, International Computing Board, Management Board, Offline Board (chair: Computing Coordinator), regional Tiers, software projects, LCG (SC2, PEB, GDB, POB), core computing and software, EU and US Grid coordination

Conclusions
– ALICE has followed a single evolution line for eight years
– Most of the initial choices have been validated by our experience
– Some parts of the framework still have to be populated by the sub-detectors
– Wish us good luck!
