Reconstruction of LHC events at CMS
04/09/2007
Tommaso Boccali - INFN Pisa, Shahram Rahatlou - Roma University, Lucia Silvestris - INFN Bari
On behalf of the CMS Offline group

Slide 2 - Outline
- Software for CMS reconstruction: use cases and required performance
- The continuing effort towards data taking
- Data challenges and preparation for physics

Slide 3 - Requirements for a Reconstruction Software
1. Allow the processing of raw data to produce data objects useful for analyses
2. Allow the full discriminating power of the CMS sub-detectors to be exploited in offline physics measurements
3. ... but also something CMS-specific: the HLT use case. LHC collisions arrive at 40 MHz, while the offline system can stream data to disk only at O(100) Hz

Slide 4 - CMS Trigger Strategy
CMS has chosen a trigger sequence in which, after the L1 (hardware-based) decision reduces the event rate from 40 MHz to 100 kHz, the offline reconstruction code itself runs in the High Level Trigger (HLT) to provide the further factor ~1000 reduction to O(100) Hz. In this way:
- We develop a single software for HLT and Offline
- We can use the full complexity of offline reconstruction at trigger level
- We can scale the system by adding CPUs
HLT functionality depends on the data rate and on the CPU resources available. The traditional L2 functionality is the most challenging part: it does not benefit from full granularity. (A worked-out version of the rate arithmetic is sketched below.)
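To make the scaling argument concrete, the following is a minimal back-of-the-envelope sketch (not from the slides) of the rate reduction and of the per-event CPU budget available to the HLT. The filter-farm core count is a purely hypothetical number used only to illustrate that the budget grows linearly with the number of CPUs.

    #include <iostream>

    int main() {
      // Rates quoted on the slide: 40 MHz collisions, 100 kHz after L1,
      // a further factor ~1000 applied in the HLT.
      const double collision_rate_hz = 40e6;
      const double l1_output_hz      = 100e3;
      const double hlt_reduction     = 1000.0;

      const double storage_rate_hz = l1_output_hz / hlt_reduction;  // ~100 Hz

      // Hypothetical filter-farm size (NOT a CMS figure): the per-event CPU
      // budget of the HLT scales linearly with the number of cores.
      const double n_cores  = 2000.0;
      const double budget_s = n_cores / l1_output_hz;  // CPU seconds per event

      std::cout << "L1 rejection factor : " << collision_rate_hz / l1_output_hz << "\n"
                << "Rate to storage     : " << storage_rate_hz << " Hz\n"
                << "HLT budget per event: " << budget_s * 1e3 << " ms (on "
                << n_cores << " cores)\n";
      return 0;
    }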

Slide 5 - Computing Model
CERN (Tier 0 and CAF):
- Filter Farm (High Level Trigger)
- CERN Analysis Facility (short-turnaround activities, typically calibrations)
- T0 activities (prompt reconstruction), raw data custodial
- ~7 MSi2k (2008); ~5 PB on tape, ~2 PB of disk (2008)
National / supra-national level - Tier 1s:
- Raw data custodial (shared), re-reconstruction, skimming, RECO/AOD production
- ~10 MSi2k (2008); ~10 PB on tape, ~5 PB of disk (2008)
Community based - Tier 2s:
- Simulation activities, analysis facilities
- ~15 MSi2k (2008); no tape, ~5 PB of disk (2008)
(Diagram: CAF, Tier 1, Tier 2, Tier 3,4,..; Tier CPU ~ 4 Si2k)

Slide 6 - CMS Data Tiers
CMS plans to implement a hierarchy of data tiers:
- Raw Data: as delivered by the detector
- Full Event (FEVT): contains Raw plus all the objects created by the reconstruction pass
- RECO: a subset of the Full Event, sufficient for reapplying calibrations after reprocessing ("refitting but not re-tracking")
- AOD: a subset of RECO, sufficient for the large majority of "standard" physics analyses; contains tracks, vertices etc. and in general enough information to (for example) apply a different b-tagging; can contain very partial hit-level information
Event sizes: RAW ~1.5 MB/event, RECO ~500 kB/event, AOD ~100 kB/event
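As a rough cross-check of how these per-event sizes connect to the PB-scale figures on the computing-model slide, here is a hedged back-of-the-envelope estimate of the yearly data volume per tier. The trigger rate and the yearly live time (~1e7 s) are assumptions made only for illustration, not numbers from the slides.

    #include <iostream>

    int main() {
      // Per-event sizes from the data-tier slide.
      const double raw_mb  = 1.5;
      const double reco_mb = 0.5;
      const double aod_mb  = 0.1;

      // Assumed running conditions (illustrative only): O(100) Hz to storage
      // and ~1e7 s of effective data taking per year.
      const double rate_hz         = 100.0;
      const double live_s_per_year = 1e7;

      const double events_per_year = rate_hz * live_s_per_year;  // ~1e9 events

      // 1 PB = 1e9 MB
      std::cout << "Events/year : " << events_per_year << "\n"
                << "RAW         : " << events_per_year * raw_mb  / 1e9 << " PB/year\n"
                << "RECO        : " << events_per_year * reco_mb / 1e9 << " PB/year\n"
                << "AOD         : " << events_per_year * aod_mb  / 1e9 << " PB/year\n";
      return 0;
    }

With these assumptions the raw-data volume comes out at the PB-per-year scale, in the same ballpark as the tape figures quoted for the Tier-0 and Tier-1s on the previous slide.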

Slide 7 - Software environments
The code must be able to run in three different scenarios:
- Bare ROOT: open the POOL file and inspect the data objects without any CMS infrastructure; portable to any machine that runs ROOT, without any effort
- CMSSW-Lite: load a small number of libraries; no access to calibrations, magnetic field map etc., but full access to physics objects (e.g. your laptop when you are not connected to the internet)
- Full CMSSW: full access to calibrations and full availability of libraries; used mainly to produce reconstructed objects from RawData down to the lower tiers (e.g. your favourite T2)
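As an illustration of the "CMSSW-Lite" scenario, the sketch below loops over a reconstructed file with the FWLite event interface. It is a minimal example under assumptions: it has to be compiled inside a CMSSW environment, the dictionary-loading call may differ between releases, and both the file name (reco.root) and the track-collection label (generalTracks) are placeholders to be replaced with the actual ones.

    // Minimal FWLite-style track loop (a sketch, not a complete analysis).
    #include <iostream>
    #include <vector>

    #include "TFile.h"
    #include "FWCore/FWLite/interface/AutoLibraryLoader.h"
    #include "DataFormats/FWLite/interface/Event.h"
    #include "DataFormats/FWLite/interface/Handle.h"
    #include "DataFormats/TrackReco/interface/Track.h"

    int main() {
      AutoLibraryLoader::enable();      // load the CMS data-format dictionaries

      TFile file("reco.root");          // placeholder file name
      fwlite::Event ev(&file);

      int nEvents = 0;
      for (ev.toBegin(); !ev.atEnd(); ++ev, ++nEvents) {
        fwlite::Handle<std::vector<reco::Track> > tracks;
        tracks.getByLabel(ev, "generalTracks");   // placeholder collection label

        std::cout << "event " << nEvents << ": "
                  << tracks.ptr()->size() << " tracks" << std::endl;
      }
      return 0;
    }

In the bare-ROOT scenario the same file can simply be opened in a plain ROOT session and its event tree browsed interactively, with no CMS libraries loaded at all.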

Slide 8 - Status at CHEP06
Report at the last CHEP (CHEP06):
- New software being written (CMS transitioned to a new framework in early 2005)
- Local reconstruction OK with simulated events (no infrastructure for calibrations and realistic conditions)
- Higher-level reconstruction only sketched: jets working (guinea pig), tracks "coming" (i.e. under debug), all the rest (higher-level algorithms) missing
(Plots: number of jets vs jet energy (GeV); 300 GeV pions; LED calibration events; cosmic ray from ECAL local reconstruction commissioning)

Slide 9 - Since then ...
CSA06 (Sept/Oct 2006): Computing, Software and Analysis challenge
- Reconstruction enhanced with: tracking, electrons (initial version), photons (initial version), b/tau tagging, vertexing, jets/MET
- First definition of data tiers (FEVT, RECO, AOD)
- Re-reconstruction and skimming demonstrated
- Total events processed > 100M
- Performance < 25 sec/event (on 1 kSi2k CPUs), even on ttbar
- Memory tops out at 500 MB/job after hours / thousands of events
- Crash rate < /event
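To give a feel for the scale of the CSA06 exercise, a quick calculation from the quoted figures follows; the farm size is a hypothetical number used only to translate CPU time into wall-clock time, and the event count is rounded to exactly 1e8.

    #include <iostream>

    int main() {
      // CSA06 figures from the slide.
      const double n_events    = 1e8;   // ">100M" events, rounded
      const double s_per_event = 25.0;  // upper bound, on 1 kSi2k CPUs

      const double cpu_seconds = n_events * s_per_event;
      const double cpu_days    = cpu_seconds / 86400.0;

      // Hypothetical farm size, for illustration only.
      const double n_cpus = 1000.0;

      std::cout << "Total CPU time : " << cpu_days << " CPU-days\n"
                << "Wall-clock time: " << cpu_days / n_cpus
                << " days on " << n_cpus << " CPUs\n";
      return 0;
    }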

Slide 10 - Ramp-up in code size / developers (Full CMSSW)
(Plots: lines of code in the repository; number of developers active in a given month, starting from CHEP06)
In 18 months: +900k lines of code, 4x the number of involved developers

Slide 11 - Now ...
CSA07 just taking off:
- Also includes HLT and analysis-skim workflows working from raw data
- Extensive physics validation of the reconstruction code
- The complete switch took 2 years
- Much more attention to calibrations
- System "complete": ready to handle real data (and being tested with commissioning data)
- Lots of new algorithms, impossible to mention all of them: tracking optimized, electrons/photons optimal, b-tagging and tau tagging, many more jet algorithms, lots of vertexing algorithms, muons optimal, Particle Flow, ...

Slide 12 - A Few Results
(Plots: tracking efficiency in the tracker detector, fully efficient for single particles with pT = 10 GeV; muon pT resolution for pT = 10, 50, 100, 500 and 1000 GeV; electron classification vs electron energy)

Slide 13 - A Few Results
(Plots: tau HLT efficiencies for QCD and Z -> tau tau; jet resolution, standard vs Particle Flow; b-tag efficiency with track counting)

Slide 14 - Data sizes
Current figures show that we are not far from the estimated quotas of 100 kB (AOD) and 500 kB (RECO); the example below uses bb events without pile-up, and even includes simulated information, which accounts for up to 20% of the payload.
- Still to be reviewed once the analysis use cases are clarified: expect some additions and a lot of cleaning
- RECO: mostly tracks and associated hits
- Validation and checks release by release

Data Tier | Size (kB/event)
RAWSIM    | 1420
RECOSIM   | 680
AODSIM    | 100

Slide 15 - Road from here to data taking
CSA07 will probe reconstruction:
- On yet another O(100M) events
- With realistic calibrations loaded for all the subsystems (i.e. a much higher RAM payload, ~1 GB/job)
- With realistic misalignment scenarios
- Using the complete range of reconstruction / re-reconstruction / skimming / reduced data-set production
At the same time, the reconstruction of data taken during commissioning (Global Runs) is certifying correct operation when starting from raw data.

Slide 16 - Conclusions
- CMS reconstruction software has gone from a development prototype to a working product in the period CHEP06-CHEP07
- It is actively debugged on:
  - hundreds of millions of simulated events (trying to operate as in 2008 real-data mode)
  - real commissioning data (preparing and testing the correct calibration/alignment infrastructure)
- We are eager to see the first pp data and to confront the software with it: at the next CHEP you will see the reconstruction code in action on it!