1 Alice off-line computing. Alice Collaboration, presented by P.Hristov, CERN. NEC'2001, September 12-18, Varna

2 Alice detector

3 Quark-gluon plasma: signatures
- (Multi)strangeness enhancement: Ω, Ξ, Λ, K
- Quarkonium suppression: J/ψ, χc, Υ
- Kinematic probes: pT, dN/dy, dET/dy spectra, two-particle correlations, etc.
- Resonance shapes and positions: ρ, ω, φ
- EM probes: photons, lepton pairs
- Hyper-matter: strangelets (2u2d2s, ...), multi-hyperon clusters
- => more than 10 event classes

4 Alice run
- Heavy-ion run: 4 weeks/year (10^6 s)
- Hadronic physics: 2x10^7 central events x 40 MB compressed raw data
- Reference minimum-bias events: 2x10^7 events x 10 MB
- Charm and dielectron physics: 2x10^7 events x 2.5 MB
- Muon physics: 10^9 events x 0.5 MB
- Proton run (10^7 s/year): 5x10^9 events x 0.2 MB
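Taking the slide's event counts and sizes at face value, the implied yearly raw-data volume can be tallied in a few lines; the constant names below are ours, for illustration only:

```cpp
#include <cassert>

// Rough yearly raw-data volume implied by the slide's event counts and
// event sizes. Figures are copied from the slide; only the arithmetic
// is added here.
constexpr double MB = 1.0;
constexpr double PB = 1e9 * MB;  // 1 PB = 10^9 MB

constexpr double hadronic   = 2e7 * 40.0;  // central Pb-Pb events
constexpr double min_bias   = 2e7 * 10.0;  // reference minimum-bias events
constexpr double dielectron = 2e7 * 2.5;   // charm and dielectron physics
constexpr double muon       = 1e9 * 0.5;   // muon physics
constexpr double proton     = 5e9 * 0.2;   // pp run

constexpr double total_MB = hadronic + min_bias + dielectron + muon + proton;
constexpr double total_PB = total_MB / PB;  // on the order of 2.5 PB/year
```

The heavy-ion and pp runs together thus imply petabytes of raw data per year, which is the "huge amount of data" constraint driving the off-line design on the next slide.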

5 Alice off-line project
- Initial constraints
  - Huge amount of data to be processed
  - Very complex detector (typical for the LHC experiments)
  - Experiment-driven physics
  - Large variety of event types
  - Fuzzy and changing requirements
- Need for a single, modular, flexible, and reliable framework

6 Brief history of the off-line project
- Before 1998: several independent Fortran codes for the preparation of the Technical Proposal
- Beginning of 1998: decision on a single framework for simulation, reconstruction, and analysis
  - OO design
  - Implemented in C++
  - Based on Root
  - Interfaces to the existing Fortran code and a smooth transition to C++

7 Root
- Super-PAW functionality
- Hierarchical object-oriented database
- C++ interpreter
- Advanced statistical analysis tools
- Optimized query mechanisms for very large data sets (trees)
- Rich set of container classes with I/O support
- Robust object I/O package
- Extensive set of GUI classes and visualization tools
- HTML documentation system
- Run-time object inspection capabilities and automatic schema evolution
- Client/server networking
- Remote database access
- http://root.cern.ch

8 Alice off-line project: AliRoot
- Rapid development: 434 classes, 290K lines of C++
- Software framework for all activities: simulation, digitization, monitoring, reconstruction, and analysis
- Open source project available via a CVS repository
- Decomposed per detector into independent modules

9 A view of the off-line processing

10 Simulation
- Event generators
  - HIJING, MEVSIM, PYTHIA: for "real physics"
  - HIJING parametrization for simulation of background events
  - EventCocktail with a fixed number of particles
- Hits & digits
  - Geant3: stable, well-known physics and limitations
  - Geant4: new, evolving
  - FLUKA: excellent physics, limited geometrical and user interfaces

11 Alice virtual MC
[diagram: detector code talks to AliRun through the abstract AliMC interface; concrete implementations TGeant3, TGeant4, and TFluka sit behind it, with the corresponding G3 and G4 geometries]
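The decoupling on this slide can be sketched as a plain abstract interface: user code depends only on the base class, and the concrete transport engine is chosen at run time. The class and method names below (VirtualMC, ProcessEvent, RunSimulation) are illustrative stand-ins, not the actual AliRoot or TVirtualMC API:

```cpp
#include <cassert>
#include <string>

// Minimal sketch of the virtual MC idea: one abstract interface, with the
// concrete transport engine (Geant3, Geant4, FLUKA) selectable at run time.
class VirtualMC {
public:
    virtual ~VirtualMC() = default;
    virtual std::string Name() const = 0;
    virtual void ProcessEvent() = 0;  // transport one event
};

class Geant3MC : public VirtualMC {
public:
    std::string Name() const override { return "Geant3"; }
    void ProcessEvent() override { /* would call into the G3 engine */ }
};

class Geant4MC : public VirtualMC {
public:
    std::string Name() const override { return "Geant4"; }
    void ProcessEvent() override { /* would call into the G4 engine */ }
};

// Detector code depends only on the interface, so engines are swappable
// without touching the detector description or the steering code.
std::string RunSimulation(VirtualMC& mc) {
    mc.ProcessEvent();
    return mc.Name();
}
```

This is the design choice that lets the same detector code be benchmarked against Geant3, Geant4, and (via TFluka) FLUKA, as listed under the ongoing simulation activities.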

12 [figure-only slide]

13 Alice background event
- 84210 particles
- Data size: hits ~1.4 GB, digits ~1.1 GB
- CPU time on an 800 MHz PIII: hits ~24 h, digits ~15 h

14 Simulation strategy
- Simulation of background events: 10 event classes x 1000 background events
- Event mixing (merging) technique
  - Creation of summable digits: provided by each detector
  - Merging of signal patterns into the background event
  - Possibility of multilayer merging
  - Different storage options
- Fast parametrized MC of the response
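The summable-digit merging above can be illustrated with a toy model: per-channel signals of a signal event are added onto a reused background event before the detector threshold is applied, so the expensive background simulation is done once. The integer channel array and the threshold value are made up for illustration; each real detector has its own digit format:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Toy event merging: "summable digits" are per-channel signals kept
// before thresholding, so signal and background stay additive. Only
// after summing is the threshold (zero suppression) applied.
std::vector<int> MergeAndDigitize(const std::vector<int>& background,
                                  const std::vector<int>& signal,
                                  int threshold) {
    std::vector<int> digits(background.size(), 0);
    for (std::size_t i = 0; i < background.size(); ++i) {
        int summed = background[i] + signal[i];      // additive before threshold
        digits[i] = (summed >= threshold) ? summed : 0;  // zero-suppress
    }
    return digits;
}
```

Note that merging already-thresholded digits instead would lose below-threshold background contributions that a signal hit could have pushed above threshold, which is why the summable (pre-threshold) form is stored.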

15 Simulation: ongoing activities
- Development and interfacing of after-burners
- Development and interfacing of flow generators
- Geant4 integration; hadronic tests and benchmarks
- TFluka interface and integration
- Radiation studies
- Improvement of the geometrical descriptions; investigation of geometrical modellers
- Implementation of the event merging
- Refactoring of the existing code

16 Reconstruction
- Exists for all detectors: track segments
- Much faster than the simulation and digitization (TPC: 30 min/central event)
- Reconstructed event size: 4 MB for data events, 5 MB for simulated ones
- Software quality tests
- Global tracking: ITS+TPC+TRD in progress

17 Tracking requirements (ITS & TPC)
- Environment
  - dN/dy up to 8000 charged particles (about 4 times that foreseen for RHIC at BNL)
- Requirements
  - Good efficiency (above 90% for pT > 0.1 GeV/c)
  - Some efficiency also below that (as low as possible)
  - Momentum resolution (Δp/p) at the level of 1% for low momenta and, for high momenta, a few % at 5 GeV/c
  - Good secondary vertexing capability (V0, charm?)
  - Particle identification capabilities, especially for low-momentum electrons (which cannot reach the other PID detectors)

18 TPC tracking

19 dE/dx in the TPC

20 Reconstruction: ongoing activities
- Cluster unfolding
- Tracks from secondary vertices
- Global Kalman track and error propagation
- Primary and secondary vertexing
- Treatment of kinks
- Fitting with constraints
- Classes for High Level Triggering
- Calibration & alignment
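The Kalman track propagation mentioned above can be illustrated by its simplest building block, a one-dimensional measurement update: the predicted track state at a detector layer is combined with that layer's measurement, weighted by the inverse variances. The State struct and the numbers are purely illustrative; the real track model is five-dimensional with full covariance matrices:

```cpp
#include <cassert>
#include <cmath>

// One-dimensional Kalman measurement update: combine a predicted track
// position with a layer measurement, each weighted by its uncertainty.
struct State {
    double x;    // estimated track position at this layer
    double var;  // variance of the estimate
};

State KalmanUpdate(State predicted, double measurement, double meas_var) {
    // Gain -> 1 when the prediction is poor, -> 0 when it is precise.
    double gain = predicted.var / (predicted.var + meas_var);
    State updated;
    updated.x = predicted.x + gain * (measurement - predicted.x);
    updated.var = (1.0 - gain) * predicted.var;  // uncertainty always shrinks
    return updated;
}
```

Repeating predict/update layer by layer is what makes the method well suited to dense events: each track is followed locally, and outlier clusters can be rejected at every step.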

21 Analysis
- Development of an AliEvent class for global reconstruction
- Specific classes for different signatures
- Physics Performance Report in 2002
  - Detector features: resolutions, efficiencies, etc.
  - Detailed study of the possibilities to extract QGP signals and parameters
  - Test of the production and distributed computing environment

22 Distributed analysis project
[diagram: selection parameters and an analysis procedure (Proc.C) are dispatched via PROOF to local and remote CPUs, which query the tag database (TagDB), run database (RDB), and event databases (DB1-DB6) and return the combined results]

23 Organization of the Alice off-line project
- Small central team at CERN: coordination, infrastructure, framework development, software tests, platforms
- Detector groups: geometrical description, simulation, reconstruction, specific algorithms and tests
- Off-line board: coordination, major design decisions, manpower planning
- Working groups on specific problems

24 Alice computing structure [organization chart]
- Offline Board, with Technical Board, Physics Board, and DAQ; Project Coordination
- Production Environment & Coordination: simulation production, reconstruction production, production farms, database organization, relations with IT & other computing centers
- Framework & Infrastructure: framework development, database technology, HLT farm, distributed computing (GRID), Data Challenge, industrial joint projects, technology tracking, documentation
- Simulation: detector simulation, physics simulation, physics validation, GEANT4 integration, FLUKA integration, radiation studies, geometrical modeller
- Reconstruction & Physics: tracking, detector reconstruction, HLT algorithms, global reconstruction, analysis tools, analysis algorithms, calibration & alignment algorithms
- GRID & World Computing: EC DataGRID WP8 coordination, regional Tiers
- Detector Projects; Technical Support; ROOT framework; Data Challenges; HLT algorithms

25 Development cycle
- Rapid prototyping, rapid user feedback
- Micro-cycles: constant updates via our CVS server
- Macro-cycles: major modifications, design choices, software releases
- Code and design reviews, refactoring
[diagram: design -> implement -> debug -> use cycle, with canonical and user entry points]

26 Alice software process
- Close in spirit to the most recent software engineering methodologies (Extreme Programming, GNU, Open Source, etc.)
- Extensive Web-based documentation, updated nightly: http://AliSoft.cern.ch/offline
- Continuous code testing (nightly builds)
- Coding conventions: code analysis and reverse engineering tool (common project with IRST, Trento)

27 Planning
- Based mainly on functionality milestones for technology and physics
- Data challenges: test of the technology and of the functional DAQ/off-line integration
- Physics challenges: test of the functionality from the user's point of view; the Physics Performance Report is the first one
- Detailed planning taking into account the current user requirements (lightweight use cases)
- Major release every 6 months, tags every week

28 Alice data challenges: milestones

ADC milestone | Bandwidth target (MB/s to mass storage) | Architecture objective | Data
5/2002 | 200 | Integration of single-detector HLT code, at least for TPC and ITS. Quasi on-line reconstruction at CERN. Partial data replication to remote centers. | Raw digits for TPC and ITS.
5/2003 | 300 | HLT prototype for all detectors that plan to make use of it. Remote reconstruction of partial data streams. | Raw digits for barrel and MUON.
5/2004 | 450 | Prototype of the final HLT system. Prototype of the final remote data replication. | Raw digits for all detectors.
5/2005 | 750-1250 | Final system. |

29 Physics challenges: milestones

PDC period | PDC milestone | Physics objective
09/01-12/01 | 01/2002 | First test of the complete chain from simulation to reconstruction for the PPR. Event merging. Simple analysis. ROOT digits.
06/03-12/03 | 01/2004 | Complete chain used for trigger studies. Prototype of the analysis tools. Comparison with parametrized Monte Carlo. Simulated raw data.
06/05-12/05 | 01/2006 | Test of the final system for reconstruction and analysis.

