Simulations and Software
CBM Collaboration Meeting, GSI, 17 October 2008
Volker Friese
Simulations – Software – Computing

Simulations
– Key observables well in hand; many studies ongoing
  – to be continued with ever more realistic detector descriptions
– Sufficient information delivered for the start of detector engineering
– Trigger considerations (charmonium, open charm) under way
– A first look into running at SIS100 is promising
– First steps taken to study so far uncovered topics:
  – centrality determination
  – event plane resolution
  – flow
  – TOF with start detector

Software status
– Software development is still rapid
– Focus shifted from detector description to reconstruction / analysis
– Needed / ongoing:
  – consolidation / cleanup
  – control
  – quality assessment
  – documentation

Tools
– Event generators:
  – The strategy of adding a signal to a background event (UrQMD) is not valid for low-multiplicity events (e.g. p+C). Look for realistic generators for such events.
  – For PSD studies, a proper fragmentation model is needed. SHIELD is OK, but does not run with TGeant3.
– MC engines:
  – TGeant3: our baseline; G3 is no longer being developed
  – TGeant4: works (some problems), but the physics list is still to be determined
  – TFluka: not operational
  – Native Fluka is used for radiation-level studies
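How the transport engine is chosen in practice: a minimal simulation macro sketch, assuming the FairRoot macro interface of that period. Detector and geometry modules are omitted, and the file names and the UrQMD input path are placeholders, not actual CBM settings.

```cpp
// run_sim.C -- selecting the MC engine in a FairRoot/CBMROOT simulation macro.
// Sketch only: detector modules are omitted, file names are placeholders.
void run_sim(const char* engine = "TGeant3")   // alternatively "TGeant4"
{
  FairRunSim* run = new FairRunSim();
  run->SetName(engine);                        // choose the transport engine
  run->SetOutputFile("sim.root");              // MC points output
  run->SetMaterials("media.geo");              // material definitions

  // Event input: UrQMD background events
  FairPrimaryGenerator* primGen = new FairPrimaryGenerator();
  primGen->AddGenerator(new FairUrqmdGenerator("urqmd.auau.25gev.ftn14"));
  run->SetGenerator(primGen);

  run->Init();
  run->Run(10);                                // transport 10 events
}
```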

Detector Description

System | MC Geometry                  | Digitisation
MVD    | Monolithic                   | Digitiser: charge sharing, clustering
STS    | Segmented, passive materials | Digitiser: charge sharing, clustering
RICH   | Passive materials            | HitProducer
MUCH   | Segmented                    | Digitiser: charge sharing, clustering
TRD    | Segmented                    | HitProducer
TOF    | Segmented, passive materials | HitProducer (advanced)
ECAL   | Segmented                    | Shower model
PSD    | Segmented                    | Digitiser
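In CBMROOT, the digitisers and hit producers listed above are implemented as tasks in the FairRoot framework. The skeleton below is only schematic: the class, branch, and digi names are illustrative rather than the actual CBMROOT classes, and the ROOT dictionary (ClassDef) boilerplate is left out.

```cpp
// Schematic digitiser task (illustrative names, ClassDef/dictionary omitted).
#include "FairTask.h"
#include "FairRootManager.h"
#include "TClonesArray.h"

class CbmExampleDigitiser : public FairTask
{
 public:
  CbmExampleDigitiser() : FairTask("ExampleDigitiser"), fPoints(0), fDigis(0) {}

  virtual InitStatus Init()
  {
    FairRootManager* ioman = FairRootManager::Instance();
    fPoints = (TClonesArray*) ioman->GetObject("StsPoint");   // MC points (input branch)
    fDigis  = new TClonesArray("CbmStsDigi");
    ioman->Register("StsDigi", "STS", fDigis, kTRUE);         // digis (output branch)
    return kSUCCESS;
  }

  virtual void Exec(Option_t* /*opt*/)
  {
    fDigis->Clear();
    // Loop over the MC points, apply charge sharing, noise and thresholds,
    // and create digis; clustering is left to a subsequent task.
  }

 private:
  TClonesArray* fPoints;   // input: MC points
  TClonesArray* fDigis;    // output: digis
};
```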

Software status
– Geometrical description fairly well advanced; supports etc. mostly missing
– Detector response models ("digitisers") implemented for almost all subsystems
– Parameters taken from the literature or from educated guesses
– They have to be tuned to detector tests / prototype measurements

Against all odds...
– Test beam time, September 2008
– First data taking with an untriggered, free-streaming DAQ
– Worked in principle (though many questions are still open)
– A first glimpse of such a data stream

Towards modeling the data stream
[Diagram comparing the processing chains: the present simulation chain (Event Generator → MC Transport → Digitisation → Reconstruction → Analysis), eventwise throughout; the experiment, where free-streaming data pass an event builder before eventwise reconstruction and analysis; and a future simulation chain that includes an event builder to model the free-streaming data before eventwise reconstruction.]
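In the free-streaming case, events are no longer defined by a hardware trigger but have to be built from the time-stamped data themselves. As a pure illustration of the idea (not CBM code), the toy event builder below groups a time-sorted message stream into event candidates wherever the time gap between consecutive messages exceeds a chosen window.

```cpp
// Toy event builder for a free-streaming data stream (illustration only).
#include <algorithm>
#include <cstddef>
#include <vector>

struct Message {
  double time;     // time stamp, e.g. in ns
  int    payload;  // detector data (placeholder)
};

bool EarlierTime(const Message& a, const Message& b) { return a.time < b.time; }

// A new event candidate is opened whenever the gap to the previous
// message exceeds maxGap; otherwise the message joins the current one.
std::vector<std::vector<Message> >
BuildEvents(std::vector<Message> stream, double maxGap)
{
  std::sort(stream.begin(), stream.end(), EarlierTime);

  std::vector<std::vector<Message> > events;
  for (std::size_t i = 0; i < stream.size(); ++i) {
    if (events.empty() || stream[i].time - events.back().back().time > maxGap)
      events.push_back(std::vector<Message>());
    events.back().push_back(stream[i]);
  }
  return events;
}
```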

Online and Offline
– Online reconstruction (L1 / Hough) will not be implemented on "normal" architectures
– Implementation on FPGAs / multi-core requires dedicated programming languages
– Up to now, models of the algorithms are implemented in CBMROOT; this will not necessarily continue to be so (?)
– Integration / connection of the framework and the online software has to be rethought

Computing
– The CBM computing model is still to be worked out
– Some facts:
  – 1 TB/s from the detector
  – archival rate 1 GB/s → 5 PB per CBM year
– Online processing will (most probably) be on site
– Reconstruction can in principle be distributed
– Analysis should be distributed
– Can the full reconstruction be done online? If yes, will the raw data still be stored?
– Connections to the FAIR computing concept?
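The two archival numbers are consistent if one assumes roughly 5 × 10^6 s of effective data taking per CBM year; this duty factor is my assumption and is not stated on the slide.

```latex
1\ \mathrm{GB/s} \times 5 \times 10^{6}\ \mathrm{s}
  \;=\; 5 \times 10^{6}\ \mathrm{GB}
  \;\approx\; 5\ \mathrm{PB\ per\ CBM\ year}
```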

FANCy
– Proposal submitted in September 2008
– CBM is used as a showcase

CBM GRID
– Aims:
  – facilitate simulations by using resources other than GSI
  – enable larger statistics: 10^5 events → 10^7 events
  – gain experience with distributed computing for real data processing
– Status:
  – central services installed at GSI, tests ongoing
– Perspectives:
  – 2008: small test grid (3–4 sites), test data challenge
  – 2009: production mode

CBM Grid Structure (work started 2008) K. Schwarz, F. Uhlig

Running site: GSI
– Computing Element (CE)
– Storage Element (SE)
– Packman, for installing the experiment software (CBMROOT) on the Grid
– FTD (File Transfer Daemon), for inter-site transfer
  – transfer protocol: xrootd
– First jobs have run successfully!
– First job output has been stored successfully on CBM Grid Storage Elements
K. Schwarz, F. Uhlig
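Files stored on a Grid Storage Element can be read from ROOT directly via the xrootd protocol. A minimal sketch, in which the host name, file path and tree name are purely illustrative:

```cpp
// Read a file from a Grid Storage Element via xrootd.
// Host name and file path are hypothetical; "cbmsim" is the usual
// FairRoot output tree name, but may differ.
#include "TFile.h"
#include "TTree.h"

void read_from_se()
{
  // TFile::Open understands root:// URLs when xrootd support is available.
  TFile* f = TFile::Open("root://se.example.gsi.de//cbm/sim/urqmd.auau.25gev.root");
  if (!f || f->IsZombie()) return;            // transfer or authentication failed

  TTree* tree = (TTree*) f->Get("cbmsim");    // simulation output tree
  if (tree) tree->Print();
  f->Close();
}
```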

CBMGrid central services K. Schwarz, F. Uhlig