Slide 1: Computing in CMS
NorduGrid Helsinki workshop, May 24, 2002
Veikko Karimäki / HIP
Slide 2: Outline
- CMS software overview
  - SIMULATION
  - RECONSTRUCTION
  - VISUALISATION
  - ANALYSIS
- Production activities
- Schedules of the CMS Data Challenge
Slide 3: Simulation-Reconstruction-Analysis Chain (now)
[Diagram] Stages and components: Generation - MC generator CMKIN, HEPEVT ntuple; Simulation - CMSIM (soon OSCAR), FZ signal files; Digitization - ORCA with G3Reader/SimReader, FZ signal and minbias in, OODB signal, minbias and digis out; Reconstruction - ORCA with RecReader, OODB tags and analysis ntuples out. The diagram marks which steps are run by production and which by the user.
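To make the data flow on this slide concrete, here is a minimal, hypothetical sketch of the chain as plain functions passing persistent products from stage to stage. The type and function names are placeholders for illustration, not CMS classes; the real chain is driven by the CMKIN/CMSIM executables and the ORCA framework described on the following slides.

```cpp
// Data-flow sketch only: placeholder types and stage names, not CMS classes.
// Each stage consumes the previous stage's persistent product.
#include <iostream>
#include <string>

struct HepevtNtuple { std::string file; };   // CMKIN output (generated events)
struct FzFile       { std::string file; };   // CMSIM (soon OSCAR) output
struct OoHits       { std::string db;   };   // hits in the OODB after G3Reader
struct OoDigis      { std::string db;   };   // digis after mixing signal with minbias pileup
struct OoTags       { std::string db;   };   // reconstruction output used for analysis

HepevtNtuple generate(int n)                 { return {"gen.ntpl (" + std::to_string(n) + " events)"}; }
FzFile       simulate(const HepevtNtuple& g) { return {"signal.fz from " + g.file}; }
OoHits       readIntoOodb(const FzFile& fz)  { return {"OOHits from " + fz.file}; }
OoDigis      digitize(const OoHits& s, const OoHits& mb) { return {"OODigis from " + s.db + " + " + mb.db}; }
OoTags       reconstruct(const OoDigis& d)   { return {"OOTags from " + d.db}; }

int main() {
    // Production runs the whole chain; a user typically starts from the OODB
    // products (SimReader/RecReader) and writes private analysis ntuples.
    OoHits minbias{"OOHits (minbias)"};
    OoTags tags = reconstruct(digitize(readIntoOodb(simulate(generate(1000))), minbias));
    std::cout << tags.db << "\n";
    return 0;
}
```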
Slide 4: Software Projects
- CMSIM: the original GEANT3 simulation of CMS
- CARF: CMS Analysis and Reconstruction Framework (deprecated)
  - In 2001 it was split out of the ORCA repository to become:
- COBRA: Coherent Object-oriented Base for simulation, Reconstruction and Analysis
- OSCAR: Object oriented Simulation for CMS Analysis and Reconstruction
  - The GEANT4 simulation framework for CMS
- ORCA: Object Reconstruction for CMS Analysis
  - The OO reconstruction program
- IGUANA: Interactive Graphical User ANAlysis
  - Toolkits for interactive analysis
- FAMOS: Fast Monte-Carlo Simulation
  - "Smearing" Monte-Carlo
- DDD (needs an acronym!)
  - The Detector Description Database
See the PRS talks for use of CMSIM, CARF/COBRA and ORCA. The new packages (IGUANA, OSCAR, FAMOS, DDD) are not yet in physicist use; see the following slides.
Slide 5: Software Process Components
- SCRAMToolBox
  - Product specification and versioning
  - Product break-down structure
  - Configuration definition and versioning
- SCRAM
  - Configuration specification
  - Assembly break-down
  - Mapping of local resources onto configuration requirements
  - Build, source-code distribution
- DAR
  - Self-contained binary distribution (production)
- cvs
  - Source code management
- CVSpm
  - Standardized directory structures and repositories
  - Responsibility and access-right information and control
- BugsRS
  - Error reporting
- Insure, workshop
  - Memory leaks
  - Performance monitoring, etc.
- CodeWizard
  - Coding rule checking
- Ignominy
  - General dependencies
  - Some project metrics
- McCabe and risk pages
  - Measurement (OO and procedural metrics)
  - Risk evaluation
- DepUty
  - Dependencies
  - Style checking
  - UML package diagrams
- Oval
  - Regression testing (unit, cluster, acceptance, verification, validation)
Slide 6: ORCA Project Relations
[Diagram] COBRA provides the framework, built on external tools (Objectivity, Anaphe, Geant4, ...); on top of COBRA sit OSCAR (simulation, replaces CMSIM soon), ORCA (reconstruction), FAMOS (fast reco/simu) and the visualisation.
Slide 7: Tracker: Track Reconstruction
- Generation of seeds (Seed Generator)
- Construction of trajectories for a given seed (Trajectory Builder)
- Ambiguity resolution (Trajectory Cleaner)
- Final fit of trajectories (Trajectory Smoother)
Each component has one or more implementations. Three different algorithms are currently fully implemented: Combinatorial Track Finding, Connection Machine, Deterministic Annealing Filter.
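Since the reconstruction is decomposed into four components with interchangeable implementations, the chain can be pictured as a set of small interfaces plus a driver. The sketch below uses present-day C++ and hypothetical class names (the actual ORCA interfaces are not reproduced here); it only illustrates how seeding, trajectory building, ambiguity cleaning and the final smoothing fit chain together, and how the three algorithms can plug in as alternative implementations.

```cpp
// Illustrative sketch only (present-day C++, hypothetical class names; the real
// ORCA interfaces are not reproduced here).  It shows how the four components
// chain: seeds -> trajectory candidates -> ambiguity-cleaned set -> final fit.
#include <cstdio>
#include <utility>
#include <vector>

struct Hit        { double x, y, z; };
struct Seed       { Hit inner, outer; };
struct Trajectory { std::vector<Hit> hits; double chi2; };

class SeedGenerator {
public:
    virtual ~SeedGenerator() = default;
    virtual std::vector<Seed> seeds(const std::vector<Hit>& hits) const = 0;
};
class TrajectoryBuilder {
public:
    virtual ~TrajectoryBuilder() = default;
    // One seed may yield several trajectory candidates.
    virtual std::vector<Trajectory> build(const Seed& seed,
                                          const std::vector<Hit>& hits) const = 0;
};
class TrajectoryCleaner {
public:
    virtual ~TrajectoryCleaner() = default;
    // Resolve ambiguities, e.g. drop candidates sharing too many hits.
    virtual std::vector<Trajectory> clean(std::vector<Trajectory> candidates) const = 0;
};
class TrajectorySmoother {
public:
    virtual ~TrajectorySmoother() = default;
    // Final fit of each surviving trajectory.
    virtual Trajectory smooth(const Trajectory& trajectory) const = 0;
};

// The driver is algorithm-agnostic: combinatorial track finding, the connection
// machine or the deterministic annealing filter plug in by providing their own
// implementations of the four interfaces.
std::vector<Trajectory> reconstructTracks(const std::vector<Hit>& hits,
                                          const SeedGenerator& gen,
                                          const TrajectoryBuilder& builder,
                                          const TrajectoryCleaner& cleaner,
                                          const TrajectorySmoother& smoother) {
    std::vector<Trajectory> candidates;
    for (const Seed& s : gen.seeds(hits)) {
        for (Trajectory& t : builder.build(s, hits))
            candidates.push_back(std::move(t));
    }
    std::vector<Trajectory> tracks;
    for (const Trajectory& t : cleaner.clean(std::move(candidates)))
        tracks.push_back(smoother.smooth(t));
    return tracks;
}

// Trivial stand-in implementations, only so the sketch runs end to end.
struct DummySeeder : SeedGenerator {
    std::vector<Seed> seeds(const std::vector<Hit>& h) const override {
        std::vector<Seed> s;
        for (std::size_t i = 1; i < h.size(); ++i) s.push_back({h[i - 1], h[i]});
        return s;
    }
};
struct DummyBuilder : TrajectoryBuilder {
    std::vector<Trajectory> build(const Seed& s, const std::vector<Hit>&) const override {
        return {Trajectory{{s.inner, s.outer}, 0.0}};
    }
};
struct DummyCleaner : TrajectoryCleaner {
    std::vector<Trajectory> clean(std::vector<Trajectory> c) const override { return c; }
};
struct DummySmoother : TrajectorySmoother {
    Trajectory smooth(const Trajectory& t) const override { return t; }
};

int main() {
    const std::vector<Hit> hits = {{0, 0, 0}, {1, 0, 1}, {2, 0, 2}};
    const auto tracks = reconstructTracks(hits, DummySeeder{}, DummyBuilder{},
                                          DummyCleaner{}, DummySmoother{});
    std::printf("%zu track(s) reconstructed\n", tracks.size());
    return 0;
}
```

One benefit of such a decomposition is that alternative algorithms can reuse each other's components, for example a common seed generator, while differing only in the builder or cleaner.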
Slide 8: IGUANA Detector and Event Display
Slide 9: Tracker - Analysis Example
[Plots] Resolutions for 100 GeV muons vs. the number of hits used.
Slide 10: CMS Distributed Production (~2001)
~1100 available CPUs, >30 involved persons, 23 sites, 11 RCs (Regional Centres).

Region  | RC location  | CPUs | Persons | Sites
USA     | Wisconsin    |   ?  |    2    |   1
USA     | UCSD         |  40  |    1    |   1
USA     | Florida      | 120  |    2    |   1
USA     | FNAL         |  80  |    5    |   1
USA     | Caltech      | 340  |    2    |   3
Russia  | Moscow       |  60  |    2    |   4
Europe  | Bristol/RAL  |  50  |    1    |   2
Europe  | INFN         | 150  |   10    |   7
Europe  | IN2P3        |  96  |    3    |   1
Europe  | Helsinki     |  10  |    2    |   1
Europe  | CERN         | 200  |    4    |   1
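As a quick consistency check on this table (assuming the flattened columns read as CPUs / persons / sites), the per-centre numbers can be summed and compared with the headline figures of ~1100 CPUs, >30 persons and 23 sites. A minimal sketch, with the Wisconsin CPU count (shown as "?" on the slide) entered as zero:

```cpp
#include <cstdio>

// Per-RC figures as read from the slide; Wisconsin's CPU count is unknown ("?")
// and is counted as 0 here.
struct RC { const char* name; int cpus, persons, sites; };

int main() {
    const RC rc[] = {
        {"Wisconsin", 0, 2, 1},  {"UCSD", 40, 1, 1},        {"Florida", 120, 2, 1},
        {"FNAL", 80, 5, 1},      {"Caltech", 340, 2, 3},    {"Moscow", 60, 2, 4},
        {"Bristol/RAL", 50, 1, 2}, {"INFN", 150, 10, 7},    {"IN2P3", 96, 3, 1},
        {"Helsinki", 10, 2, 1},  {"CERN", 200, 4, 1},
    };
    int cpus = 0, persons = 0, sites = 0;
    for (const RC& r : rc) { cpus += r.cpus; persons += r.persons; sites += r.sites; }
    // Expected: ~1100 CPUs (1146 without Wisconsin), 34 persons (>30), 23 sites.
    std::printf("CPUs %d, persons %d, sites %d\n", cpus, persons, sites);
    return 0;
}
```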
Slide 11: Production 2002, Complexity
- Number of Regional Centers: 11
- Number of Computing Centers: 21
- Number of CPUs: ~1000
- Largest Local Center: 176 CPUs
- Number of Production Passes for each Dataset (including analysis group processing done by production): 6-8
- Number of Files: ~11,000
- Data Size (not including fz files from Simulation): 15 TB
- File Transfer: by GDMP and by perl scripts over scp/bbcp
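The transfers themselves were handled by GDMP and by perl scripts layered over scp/bbcp. As a rough illustration of that layering idea only (not the actual production scripts, which were perl), here is a hedged C++ sketch that tries bbcp first and falls back to scp; the host and file names are hypothetical.

```cpp
// Illustration only: the 2002 production used GDMP and perl scripts for this;
// the fragment below just sketches the "bbcp with scp fallback" idea.
#include <cstdlib>
#include <iostream>
#include <string>

// Copy one file to a remote regional centre: try bbcp with parallel streams,
// fall back to plain scp if bbcp is unavailable or fails.
bool transfer(const std::string& src, const std::string& dest) {
    const std::string bbcp = "bbcp -s 8 " + src + " " + dest;  // -s: parallel streams
    if (std::system(bbcp.c_str()) == 0) return true;
    std::cerr << "bbcp failed, retrying with scp\n";
    const std::string scp = "scp " + src + " " + dest;
    return std::system(scp.c_str()) == 0;
}

int main() {
    // Hypothetical Objectivity database file and destination.
    const bool ok = transfer("hits1.signal.DB", "rc.example.org:/data/objy/hits1.signal.DB");
    return ok ? 0 : 1;
}
```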
Slide 12: Typical Event Sizes
Simulated:
- 1 CMSIM event = 1 OOHit event = 1.4 MB
Reconstructed:
- 1 "10^33" event = 1.2 MB
- 1 "2x10^33" event = 1.6 MB
- 1 "10^34" event = 5.6 MB

CMS Produced Data in 2001
Simulated events: Florida 0.05 M, UCSD 0.06 M, Wisconsin 0.07 M, Helsinki 0.13 M, IN2P3 0.31 M, Moscow 0.43 M, INFN 0.76 M, CERN 1.10 M, Bristol/RAL 1.27 M, FNAL 1.65 M, Caltech 2.5 M.
Reconstructed with pileup: Wisconsin 0.05 TB, Florida 0.08 TB, IN2P3 0.10 TB, UCSD 0.20 TB, Bristol/RAL 0.22 TB, INFN 0.40 TB, Moscow 0.45 TB, Caltech 0.60 TB, FNAL 12 TB, CERN 14 TB.
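These per-event sizes and per-site counts combine in a simple back-of-the-envelope check that is consistent with the summary slide (~8 M simulated events and roughly 30 TB of data produced in 2001). A small worked example using only the numbers on this slide, with 1 TB taken as 10^6 MB:

```cpp
#include <cstdio>

int main() {
    // Simulated events in 2001, in millions, summed over the sites listed above.
    const double simulatedM = 0.05 + 0.06 + 0.07 + 0.13 + 0.31 + 0.43
                            + 0.76 + 1.10 + 1.27 + 1.65 + 2.5;           // ~8.3 M
    const double mbPerOoHitEvent = 1.4;                                   // MB per simulated/OOHit event
    const double simulatedTB = simulatedM * 1e6 * mbPerOoHitEvent / 1e6;  // MB -> TB (1 TB = 10^6 MB)

    // Reconstructed-with-pileup data, in TB, summed over the sites listed above.
    const double reconstructedTB = 0.05 + 0.08 + 0.10 + 0.20 + 0.22
                                 + 0.40 + 0.45 + 0.60 + 12 + 14;          // ~28 TB

    std::printf("Simulated: %.2f M events, ~%.1f TB of OOHits\n", simulatedM, simulatedTB);
    std::printf("Reconstructed with pileup: ~%.1f TB\n", reconstructedTB);
    return 0;
}
```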
Slide 13: Production Status 2002
On schedule for the June 1 deadline.
Sites: Imperial College, Bristol/RAL, Wisconsin, UCSD, Moscow, IN2P3, INFN, FNAL, Florida, CERN, Caltech.
Slide 14: Data Transfers
[Diagram] Transfer topology among INFN, CERN, FNAL, Bristol/RAL, Caltech, Moscow, IN2P3, UFL, HIP, Wisconsin and UCSD: min. bias Objectivity/DB files, .fz files and Objectivity/DB data flow between the centres; RCs archiving data and RCs publishing data are marked.
Slide 15: CMS and the GRID
- CMS Grid Implementation Plan for 2002 published (CMS NOTE-2002/015)
- Close collaboration with EDG and GriPhyN/iVDGL, PPDG
- Upcoming CMS GRID/Production Workshop (June CMS week)
  - File Transfers
    - Production File Transfer Software Experiences
    - Production File Transfer Hardware Status & Reports
    - Future Evolution of File Transfer Tools
  - Production Tools
    - Monte Carlo Production System Architecture
    - Experiences with Tools
  - Monitoring / Deployment Planning
    - Experiences with Grid Monitoring Tools
    - Towards a Rational System for Tool Deployment
Slide 16: CMS - Schedule for Challenge Ramp-Up
- All CMS work to date is with Objectivity, which is now being phased out and replaced with LCG software
  - Enforced lull in production challenges
    - No point in doing work to optimize a solution that is being replaced
    - (But much was learnt in past challenges to influence the new design)
  - Use challenge time in 2002 to benchmark current performance
  - Aim to start testing the new system as it becomes available
    - Target early 2003 for first realistic tests
    - Thereafter return to a roughly exponential complexity ramp-up, reaching 50% complexity in 2005
- 20% Data Challenge
Slide 17: Objectivity Issues
- The outlook is bleak
  - CERN has not renewed the Objectivity maintenance
    - Old licenses are still applicable, but cannot be migrated to new hardware
  - Our understanding is that we can continue to use the product as before, clearly without support any longer
    - But it cannot be used on newer RedHat OSs (7...) (or other Linux OSs)
- It will become increasingly difficult during this year to find sufficient resources correctly configured for our Objectivity usage
- We are preparing for the demise of our Objectivity-based code by the end of this year
  - CMS is already contributing to the new LCG software
  - Aiming to have first prototypes of the catalog layer by July
  - Initial release of the CMS prototype ROOT+LCG in September
Slide 18: Planning - CMS Computing
- 2002: DAQ Technical Design Report
- 2003: GEANT4 validation, 5% Data Challenge starts
- Beginning of 2004: 5% Data Challenge complete
- End of 2004: Computing and Core Software (CCS) TDR submitted
- 2004-2005: Physics TDR
- 2005: 20% Data Challenge
- Beginning of 2006: 20% Data Challenge complete
- 2006: CCS commissioning
- Beginning of 2007: fully operational computing systems (20% capacity)
- 2007-2008: CCS systems ramp-up
- Beginning of 2009: CCS systems 100% operational
Note: the new LHC schedule caused 9-15 month adjustments in the CMS computing planning.
Slide 19: Summary
- CMSIM/GEANT3 (Fortran) to be replaced by OSCAR/GEANT4
- Then the full chain will be in C++
- ODBMS: Objectivity --> customized ROOT, under work
- CMS simulation mass productions are well under way
  - 11 Regional Centres, >1000 CPUs being used
  - ~30 TB of data in 2001, 15 TB in 2002 so far
  - ~8 M events in 2001 + ~25 M MinBias events for pile-up
- Active participation in LCG
- 5% Data Challenge planned for the beginning of 2004
- 20% Data Challenge for 2006
- Delays of 9 to 15 months due to the new LHC schedule