Slide 1: CMS Software and Computing
M. Stavrianakou, on behalf of the CMS collaboration
IEEE/NSS: LHC Software: Crunch Time, 26.10.2005

Outline:
- External (common LHC, HEP) software
- CMS software
  - Organization, framework and software domains
  - Reconstruction and detector performance
  - Trigger
  - Fast simulation
- CMS computing
  - CMS Computing TDR
  - LCG Service Challenge 3
  - Software/computing integration
- Major milestones in CMS software and computing

Slide 2: External (common LHC/HEP) software

Slide 3: External software currently used in CMS
- Persistency framework: POOL
- Core libraries and services: SEAL
- Data analysis: ROOT
- Generators: GENSER
- Simulation: Geant4
- HEP class libraries: CLHEP, CLHEP/HepMC (see the usage sketch below)
- Miscellaneous libraries: BOOST
- Parsers: Xerces
- Visualization: Qt, Coin3D, ...
- Scripting: Python, Perl
- Miscellaneous: CppUnit, RuleChecker, ...; Oracle, MySQL, ...
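
To give a feel for how such packages appear in experiment code, here is a minimal sketch using the CLHEP Lorentz-vector class; the kinematic values are invented, and this is illustrative usage, not code from CMS.

```cpp
// Build two muon four-vectors with CLHEP and compute their invariant
// mass. HepLorentzVector(px, py, pz, E); components in GeV.
#include <iostream>
#include "CLHEP/Vector/LorentzVector.h"

int main() {
  CLHEP::HepLorentzVector mu1( 25.0,  10.0,  5.0, 27.4);
  CLHEP::HepLorentzVector mu2(-20.0,  -8.0, 12.0, 24.7);
  CLHEP::HepLorentzVector z = mu1 + mu2;
  std::cout << "dimuon invariant mass: " << z.m() << " GeV" << std::endl;
  return 0;
}
```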

Slide 4: CMS Software

Slide 5: Computing, Physics, Trigger (CPT) organization
[Organization chart, flattened; project leaders in parentheses.]
- Project Manager: P. Sphicas; Project Office: V. Innocente, L. Taylor
- Software (L. Silvestris, A. Yagil): Framework (L. Sexton), Reconstruction (T. Boccali), HLT/DQM (E. Meschi), Analysis Tools (L. Lista), Simulation (M. Stavrianakou), Fast Simulation (P. Janot), Calibration/Alignment (O. Buchmuller, L. Lueking), SW Development Tools (S. Argiro'), ORCA for PTDR (S. Wynhoff)
- Computing (L. Bauerdick, D. Stickland): Integration Program (S. Belforte, I. Fisk), Operations Program (L. Barone), Technical Program (P. Elmer, S. Lacaprara), Facilities and Infrastructure (N. Sinanis)
- Detector-PRS (D. Acosta, P. Sphicas): ECAL/e-γ (C. Seez, Y. Sirois), Tracker/b-τ (I. Tomalin, F. Palla), HCAL/JetMET (J. Rohlf, C. Tully), Muons (N. Neumeister, U. Gasparini), Online Selection (S. Dasu, C. Leonidopoulos)
- Analysis-PRS (A. De Roeck, P. Sphicas): Higgs (S. Nikitenko), SUSY & BSM (L. Pape, M. Spiropulu), Standard Model (J. Mnich), Heavy Ions (B. Wyslouch), Generator Tools (F. Moortgat, S. Slabospitsky)

Slide 6: Software project: two major lines
- Physics TDR (current thread)
  - Goal: maintain and support all current software projects, in coordination with the Physics Reconstruction and Selection (PRS) groups, until April/May 2006
- Migration thread
  - Goal: deliver the software components for the Cosmic Challenge
    - Framework (FW)
    - Event Filter and its integration with DAQ and the FW
    - Geometry DB / calibration and alignment infrastructure (online/HLT/offline)
    - Basic (local) reconstruction
    - Additional reconstruction needed for the muon chambers
    - Detector monitoring and data quality monitoring
    - Basic visualization
- The Cosmic Challenge will use the new EDM and framework (see the sketch below): an essential test toward the integration of the full experiment
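
To make the event-data-model-plus-framework idea concrete, the following is a schematic of the module-and-event pattern such a framework follows. All class and method names are invented for illustration; this is not the actual CMS framework interface. The key property is that modules communicate only through products stored in the event, which is what lets the same reconstruction components run offline, in the HLT, and in the Cosmic Challenge.

```cpp
// Schematic EDM/framework sketch (invented names, not the real CMS
// API): the Event owns all data products, keyed by label, and modules
// read existing products and add new ones, keeping no per-event state.
#include <map>
#include <string>
#include <utility>
#include <vector>

struct Hit { double x, y, z; };

class Event {
public:
  void put(const std::string& label, std::vector<Hit> product) {
    products_[label] = std::move(product);
  }
  const std::vector<Hit>& get(const std::string& label) const {
    return products_.at(label);
  }
private:
  std::map<std::string, std::vector<Hit>> products_;
};

class LocalRecoModule {
public:
  void produce(Event& event) const {
    const std::vector<Hit>& raw = event.get("rawHits");
    std::vector<Hit> reco(raw);            // trivial stand-in "reconstruction"
    event.put("recHits", std::move(reco));
  }
};

int main() {
  Event event;
  event.put("rawHits", {{0.0, 0.0, 0.0}, {1.0, 1.0, 1.0}});
  LocalRecoModule().produce(event);
  return event.get("recHits").size() == 2 ? 0 : 1;
}
```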

Slide 7: New software: executive summary (I)
- Framework in place
- Event Filter (EvF)
  - Well tested in the old framework; migration to the new framework in progress
- Calibration and alignment
  - Calibration objects defined in collaboration with the detector/reconstruction groups
  - First-round implementation of the conditions service complete
  - Interval Of Validity (IOV) object prototyped (see the sketch below)
  - End-to-end (E2E) test well advanced
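
The IOV object mentioned above associates each conditions payload with the range of runs (or times) for which it is valid. A minimal sketch of the idea, assuming run-number-keyed intervals, is given below; this is not the actual CMS conditions interface.

```cpp
// Minimal Interval-Of-Validity (IOV) sketch: payloads are keyed by the
// first run for which they are valid, and a lookup returns the payload
// whose interval contains the requested run.
#include <cstdint>
#include <iterator>
#include <map>
#include <stdexcept>
#include <string>
#include <utility>

using Run = std::uint32_t;

template <typename Payload>
class IOVSequence {
public:
  // Register a payload valid from firstRun until the next entry starts.
  void add(Run firstRun, Payload p) { seq_[firstRun] = std::move(p); }

  // Find the payload valid for the given run.
  const Payload& get(Run run) const {
    auto it = seq_.upper_bound(run);   // first interval starting after run
    if (it == seq_.begin())
      throw std::out_of_range("no IOV covers this run");
    return std::prev(it)->second;      // step back to the covering interval
  }

private:
  std::map<Run, Payload> seq_;
};

int main() {
  IOVSequence<std::string> pedestals;
  pedestals.add(1,   "pedestals_v1");  // valid for runs 1..99
  pedestals.add(100, "pedestals_v2");  // valid from run 100 onwards
  return pedestals.get(57) == "pedestals_v1" ? 0 : 1;
}
```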

Slide 8: New software: executive summary (II)
- Simulation
  - Most components in place; integration for production testing scheduled for mid-November 2005
- Geometry service in the new framework
  - Done, being tested
- Local reconstruction
  - Well underway
- Data Quality Monitoring (DQM)
  - In excellent shape; porting to the new framework in progress
- Visualization
  - Basic visualization for cosmic data in place

Slide 9: CMS data: an excellent debugging tool
[Event display: a real cosmic-ray event in HCAL at SX5.]

Slide 10: Reconstruction and detector performance

Slide 11: Electron reconstruction
- Effect of loosening the seed-cluster ET threshold and the pixel matching on electrons in H→ZZ*→2e2μ (D. Futyan)
- Combination of the ECAL energy E and the track momentum p for the best four-vector estimates: E-scale corrections for ECAL depending on the electron classification, full errors used in the weighted means (C. Charlot et al.; the schematic form is recalled below)
[Plot: H→ZZ*→4e, M(H) = 150 GeV.]
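
The "weighted mean" combination of the ECAL energy E and the track momentum p is, schematically, an inverse-variance average. The form below assumes Gaussian, uncorrelated measurements; as the slide notes, in practice the E-scale corrections and errors depend on the electron classification.

```latex
% Inverse-variance combination of ECAL energy E and track momentum p
E_{\mathrm{comb}}
  = \frac{E/\sigma_E^{2} + p/\sigma_p^{2}}{1/\sigma_E^{2} + 1/\sigma_p^{2}},
\qquad
\frac{1}{\sigma_{\mathrm{comb}}^{2}}
  = \frac{1}{\sigma_E^{2}} + \frac{1}{\sigma_p^{2}}
```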

Slide 12: Photon reconstruction
- Photon isolation studies (T. Aaltonen, K. Lassila): the aim is the very large rejection factors needed for the H→γγ channel, using a combination of ECAL, HCAL and tracker information (a sketch of the tracker part follows below). [Plot: signal efficiency vs. rejection, with a previous result shown for comparison.]
- Photon error estimation (M. Pieri): shown as a function of shower width, which is sensitive to conversions and to the conversion distance from ECAL.
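
The tracker part of such an isolation variable is, in essence, a scalar pT sum over tracks inside a cone around the photon. A minimal sketch follows; the cone size and threshold are invented values, not the cuts used in the analysis.

```cpp
// Tracker-isolation sketch: sum track pT inside a Delta-R cone around
// the photon direction; the photon is "isolated" if the sum is small.
#include <cmath>
#include <vector>

struct Track { double pt, eta, phi; };

constexpr double kPi = 3.141592653589793;

// Wrap an angle difference into (-pi, pi].
double deltaPhi(double a, double b) {
  double d = a - b;
  while (d >  kPi) d -= 2.0 * kPi;
  while (d < -kPi) d += 2.0 * kPi;
  return d;
}

double isolationSum(double photonEta, double photonPhi,
                    const std::vector<Track>& tracks, double coneDR) {
  double sum = 0.0;
  for (const Track& t : tracks) {
    const double dEta = t.eta - photonEta;
    const double dPhi = deltaPhi(t.phi, photonPhi);
    if (std::sqrt(dEta * dEta + dPhi * dPhi) < coneDR) sum += t.pt;
  }
  return sum;
}

int main() {
  const std::vector<Track> tracks = {{1.2, 0.10, 0.20}, {4.0, 1.50, -2.00}};
  // Photon at (eta, phi) = (0.0, 0.1); cone 0.3 and 2 GeV are illustrative.
  const bool isolated = isolationSum(0.0, 0.1, tracks, 0.3) < 2.0;
  return isolated ? 0 : 1;
}
```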

Slide 13: Muons: reconstruction on cosmic data
[Event display and residuals: track segments in one chamber (MB3, sector 10), 12 recHit points; σ = 510 μm, as expected for cosmics (not bunched!).]

Slide 14: Z reconstruction
[Plot: reconstructed M(μ+μ−), in events/GeV, for 1 day at a luminosity of 2×10^33 cm−2 s−1 (the mass formula is recalled below); curves for 2 identified muons and for 1 identified muon plus a second track with pT > 10 GeV; backgrounds from pp→WX and bb→2μX; signal region indicated.]
- HLT + offline cuts: pT(1,2) > 20, 10 GeV/c
- bb→2μX is suitable for an efficiency study from real data
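
As a reminder (this is the standard collider formula, not anything CMS-specific), the dimuon mass plotted above follows from the two muon four-momenta; with the muon mass negligible at these momenta it reduces to

```latex
% Dimuon invariant mass, massless-muon approximation
M_{\mu\mu}^{2} = (p_{1} + p_{2})^{2}
  \approx 2\, p_{T1}\, p_{T2}
    \left[\cosh(\eta_{1} - \eta_{2}) - \cos(\phi_{1} - \phi_{2})\right]
```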

Slide 15: Jet measurement
[Plot: barrel HB cell response (GeV scale); A. Heister et al., CMS AN 2005/05.]

Slide 16: HCAL response at low energy
- 2004 test-beam results (J. Damgov et al.)

Slide 17: Cosmic data of June 2005 used in track-based alignment
- Proof of principle: track-based alignment with the CMS software tools works
- Serves as a development platform for alignment algorithms
- Future: large cosmic runs to validate the TOB rod survey measurements; this defines the pre-data-taking geometry!
- Setup: the Cosmic Rack (FinnCRack), corresponding to a slice of the TOB; 4 fully equipped rods were used, and individual modules were aligned
- First application of track-based alignment using real CMS data: convergence and good alignment in 5 iterations (a toy illustration of the iteration follows below). [Plot: mean track χ2 vs. iteration.]
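
The iterate-until-converged behaviour quoted above can be illustrated with a deliberately simplified toy: estimate each module's offset from its track residuals, apply the correction, and repeat. This only illustrates the iteration loop, not the CMS alignment algorithm, and the numbers are invented.

```cpp
// Toy 1D track-based alignment: each module has an unknown position
// offset; each iteration estimates it from the residuals (a damping
// factor stands in for the noisy estimate a real track fit would give)
// and corrects it before the next pass.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
  const std::vector<double> truthOffset = {0.30, -0.21, 0.12, -0.05};  // mm
  std::vector<double> correction(truthOffset.size(), 0.0);

  for (int iter = 1; iter <= 5; ++iter) {
    double worst = 0.0;
    for (std::size_t m = 0; m < truthOffset.size(); ++m) {
      const double residual = truthOffset[m] - correction[m];
      correction[m] += 0.7 * residual;   // damped update
      worst = std::max(worst, std::fabs(residual));
    }
    std::printf("iteration %d: worst residual %.4f mm\n", iter, worst);
  }
  return 0;
}
```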

Slide 18: Fast Simulation: FAMOS

Slide 19: FAMOS highlights
[Plot: number of bremsstrahlung photons (> 500 MeV) for electrons with pT = 35 GeV/c, FAMOS vs. ORCA.]
- Tracker material tuned layer by layer, as a function of η, to 1% accuracy with respect to the full geometry
- Pile-up simulation
- Decays of long-lived particles (K0s, charged pions, ...)
- Muons: efficiency/resolution tables (see the smearing sketch below), muon isolation, global muon reconstruction
- Electrons/photons
- HLT: JetMET, e/γ and muons
- Jets: shower parametrization, calibration, reconstruction, ...
- ROOT-based analysis
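
The efficiency/resolution-table approach amounts to replacing full reconstruction with a table lookup plus Gaussian smearing. A minimal sketch is given below; the η binning and resolution numbers are invented, whereas FAMOS uses detailed tables derived from the full simulation.

```cpp
// Fast-simulation smearing sketch: look up a relative pT resolution in
// an |eta|-binned table and smear the generated pT with a Gaussian.
#include <cmath>
#include <random>
#include <vector>

struct ResolutionTable {
  std::vector<double> etaEdges;  // ascending |eta| bin edges
  std::vector<double> sigmaRel;  // relative pT resolution per bin

  double lookup(double eta) const {
    const double ae = std::fabs(eta);
    for (std::size_t i = 0; i + 1 < etaEdges.size(); ++i)
      if (ae < etaEdges[i + 1]) return sigmaRel[i];
    return sigmaRel.back();
  }
};

int main() {
  const ResolutionTable table{{0.0, 0.9, 1.2, 2.4}, {0.01, 0.02, 0.04}};
  std::mt19937 rng(42);

  const double genPt = 35.0, genEta = 1.1;   // generated muon, GeV/c
  std::normal_distribution<double> smear(1.0, table.lookup(genEta));
  const double recoPt = genPt * smear(rng);  // "reconstructed" pT
  return recoPt > 0.0 ? 0 : 1;
}
```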

Slide 20: CMS Computing

Slide 21: Computing TDR (C-TDR)
- C-TDR goals
  - Summarize and refine the Computing Model (CM); expand on analysis and on the role of the Tier-n centres
  - Explain the architecture of the CMS computing system
  - Detail the project organization and technical planning
  - Form the basis for further internal discussion
- The C-TDR is the baseline of the CMS computing services and workflows
  - It specifies "baseline" targets and a development strategy, alongside the LCG TDR, which describes the facilities and the Grid baseline services
  - It is not an engineering "blueprint" for the computing system
- The LHCC review of the C-TDR has been successfully concluded

Slide 22: Computing Model principles
- Structured data + structured data Grid
  - Primary datasets and data tiers optimize sequential data access
  - A tiered Grid facilitates well-defined roles and responsibilities for the centers
- Data granularity: primary datasets
  - Data always needs to be considered in its trigger context, i.e. by trigger path
  - O(2 PB)/yr of raw data, split into O(50) trigger-determined datasets of ~40 TB each
- Data tiers: RAW, RECO, AOD
  - Keep RAW and RECO close together, initially at least (FEVT)
  - AOD: a full copy at each Tier-1; partial (even full) copies at many Tier-2s
- Computing tiers
  - CMS-Tier0: prompt reconstruction, close connection to online
  - CMS-CAF: high-priority, short-latency data processing; analysis support
  - Tier-1s: FEVT data custody, selection, distribution, re-reconstruction
  - Tier-2s: MC production and data analysis under physicist "control"

Slide 23: Technical baseline principles
- Optimize for the common case
  - Optimize for read access: most data is write-once, read-many
  - Optimize for bulk processing, but without limiting the single user
- Decouple parts of the system
  - Minimize job dependencies: allow parts of the system to change while jobs are running
  - Site-local information stays site-local
- Know what you did to get here
  - "Provenance tracking" is required to understand the origin of the data
- Keep it simple! For example, initially use explicit data placement (illustrated below)
  - Data does not move around in response to job submission
  - All data is placed at a site through explicit CMS policy
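
"Explicit data placement" means the dataset-to-site mapping is a policy table consulted at brokering time: jobs go to the data, and data never moves as a side effect of submission. A minimal sketch of that idea, with invented names, is shown below.

```cpp
// Explicit-placement sketch: a policy table records which sites hold
// each dataset; the broker routes jobs only to those sites and never
// triggers a transfer.
#include <map>
#include <set>
#include <string>

class PlacementPolicy {
public:
  void place(const std::string& dataset, const std::string& site) {
    sitesFor_[dataset].insert(site);   // placement is an explicit decision
  }
  const std::set<std::string>& sitesHolding(const std::string& dataset) const {
    static const std::set<std::string> none;
    const auto it = sitesFor_.find(dataset);
    return it == sitesFor_.end() ? none : it->second;
  }

private:
  std::map<std::string, std::set<std::string>> sitesFor_;
};

int main() {
  PlacementPolicy policy;
  policy.place("/Zmumu/RECO", "T1_ExampleSite");  // hypothetical names
  // A job over /Zmumu/RECO may only be brokered to T1_ExampleSite;
  // running it elsewhere is refused rather than moving the data.
  return policy.sitesHolding("/Zmumu/RECO").count("T1_ExampleSite") ? 0 : 1;
}
```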

Slide 24: Computing organization principles
- New computing organization based on projects and programs
  - Recognizes the need for well-scoped (sub-)projects
  - Requires institutional responsibility to provide effort to the projects
- Close coordination across the management team
  - Including weekly meetings with the Grids: Integration Task Force, LCG PEB/MB
- Up front: address the issue of "getting computing ready"
  - The computing model and technical baseline are in hand
  - Integration milestones serve as Computing task drivers
  - Keep the system continually "up" through data and service challenges (DC/SC), incrementally improving capabilities
- Establishing the integration, operations (analysis support), and facilities programs
  - A paradigm shift towards focusing on making things work end-to-end
  - Together with WLCG, EGEE/LCG, OSG, CERN-IT, the Tier-1s, the Tier-2s et al.

Slide 25: LCG Service Challenge 3 (SC3)
- Goal: demonstrate the basic functionality expected at each computing tier, simultaneously
  - Data can be transferred from CERN to the Tier-1 sites
  - Data can be validated, and the existence of data can be published
  - Remote applications can be executed against local data, to simulate skimming and selection jobs
  - Data can be transferred to Tier-2 centers while validation and publication proceed and remote analysis applications execute against the data
  - At the same time, simulation jobs are executed at Tier-2 centers and their results transferred to Tier-1 centers for archiving
- Status: the sites have validated the CMS transfer components during the throughput phase, the publication tools have been tested at most sites, and the grid interfaces have been validated at all sites

Slide 26: Integration program
- Purpose: ensure that Computing delivers functional computing systems
  - Preparing for the major service and data challenges
  - Integrating the CMS data management and workload management systems with the EGEE and OSG middleware
- Covers
  - Production-related activities
  - Analysis-related activities
  - Database deployment activities
  - Participation in the Service Challenges

Slide 27: CMS-LCG Integration Task Force
- Goal: satisfy CMS needs by making the EGEE middleware work for CMS
- "Make it work": do not stop at problems; fix them, working with EGEE/LCG
- The Task Force is a CMS project, led by CMS, and part of the Integration Program
- It is built around specific items that CMS needs
- It will not address all items on the CMS agenda, only those that can be dealt with using EGEE middleware
- It will steer EGEE/LCG development and deployment so that they take over as much of CMS's needs as possible

Slide 28: CPT: major milestones in 2005/06
- Computing TDR: done
- Magnet test (Cosmic Challenge)
  - Ready by December 2005; "slice" tests in progress
  - Start taking data in January, for ~5 months
- Physics TDR, Volume I: Draft 2 ready
- Physics TDR, Volume II: detailed outline ready
- Data/computing/analysis challenge in 2006
  - Currently the most important goal for 2006