1
CMS: Computing, Software & Physics Preparation
DOE Annual Program Review, September 26, 2007
Oliver Gutsche for the CMS Center
2
Introduction
CMS is scheduled to start data taking with the first collisions of the LHC in July 2008.
CMS is preparing its systems for the recording of collision data and the simulation of Monte Carlo events:
– Detector (covered by K. Maeshima's talk)
– Computing
– Software
CMS is developing the procedures needed to extract physics results:
– Commissioning (covered by K. Maeshima's talk)
– Preparation for physics
This talk summarizes the efforts and contributions of the Fermilab CMS Center to the Computing, Software and Physics Preparation areas of the CMS program.
3
CMS Computing model
– Distributed computing infrastructure (CERN: 20%, T1: 40%, T2: 40%)
– Data access is location driven
– T1 centers play a prominent role as they share the custodial storage of raw & reconstructed data
– Fermilab is the largest T1 center of CMS and the only T1 in the Americas
– Network capabilities of the whole computing system play a very important role
[Diagram: 1 Tier-0 (CERN): data recording, primary reconstruction, partial reprocessing, first (cold) archive copy of the raw data. 7 Tier-1 centers (USA, UK, Italy, France, Germany, Spain, Taiwan): share of raw & reconstructed data for custodial storage, data reprocessing, analysis tasks (skimming), data serving to Tier-2 centers for analysis, archival of simulation from Tier-2. 25-50 Tier-2 centers: Monte Carlo production, primary analysis facilities.]
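As a rough, purely illustrative summary of the model (this is not CMS configuration code; the shares and roles are simply those listed on this slide, arranged as a data structure):

```python
# Illustrative summary of the CMS computing model described on this slide.
# Shares and roles are taken from the slide text; the dictionary layout itself
# is a sketch, not any actual CMS configuration format.
computing_model = {
    "Tier-0 (CERN)": {
        "resource_share": 0.20,
        "roles": [
            "data recording",
            "primary reconstruction",
            "partial reprocessing",
            "first (cold) archive copy of the raw data",
        ],
    },
    "Tier-1 (7 centers: USA, UK, Italy, France, Germany, Spain, Taiwan)": {
        "resource_share": 0.40,
        "roles": [
            "custodial storage of a share of raw & reconstructed data",
            "data reprocessing",
            "analysis tasks (skimming)",
            "data serving to Tier-2 centers",
            "archival of simulation produced at Tier-2",
        ],
    },
    "Tier-2 (25-50 centers)": {
        "resource_share": 0.40,
        "roles": ["Monte Carlo production", "primary analysis facilities"],
    },
}

for tier, info in computing_model.items():
    print(f"{tier}: {info['resource_share']:.0%} of resources")
    for role in info["roles"]:
        print(f"  - {role}")
```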
4
Fermilab
T1 centers serve as an extension of the experiment's on-line computing:
– Share of raw data for custodial storage
– Data reprocessing
T1 centers provide access to the data entrusted to them (data-location-driven data access):
– Selection and skimming for analysis access
– Data serving to T2 centers
T1 centers support regionally connected T2 centers:
– Specific operational support responsibilities
– Archival storage of simulation and important analysis products from T2 centers
Fermilab's CMS program goes beyond T1 responsibilities, supporting the US & international communities:
– Fermilab is part of the University Support Network and provides a central analysis facility (LPC-CAF) for all US and international collaborators
– Fermilab's T1 center benefits from strong support from the global computing activities of Fermilab: strong facility support due to the large experience operating computing for the Tevatron experiments (processing farms, mass storage, infrastructure, etc.)
– Fermilab's network group supports the T1 network needs and also the national and international network connections of the US T2 and US university networks (very important for CMS wide-area data movement)
5
Fermilab T1 center efforts
Fermilab is delivering the single largest T1 facility for CMS (~15 FTE) and the only T1 in the Americas:
– Corresponds to two nominal T1 centers
– Supports US CMS authors, ~⅓ of the CMS collaboration of ~2000 members
Fermilab is either responsible for or contributing to several service development activities central to CMS' distributed computing system (~6 FTE):
– The service framework for Monte Carlo simulation at T2 centers and re-processing and skimming at the T1 centers (ProdAgent)
– Data management components and the analysis workflow on OSG (CMS Remote Analysis Builder, CRAB)
– Integration, development, and deployment efforts within the Open Science Grid for CMS (European / US grid interoperability, etc.)
CMS uses the Open Science Grid to interconnect the US T1/T2/T3 centers and universities.
Fermilab contributes significantly to the overall data operations of CMS by processing central tasks (re-reconstruction, skimming, etc.) primarily at the Fermilab T1 (~2 FTE).
6
Fermilab T1 facility
Fermilab's T1 facility is in the 3rd year of a 4-year procurement period and within budget:
– Number of CPUs doubled to ~900 nodes, corresponding to 5.5 MSi2k
– Disk space increased to 1.7 PByte (one of the world's largest dCache installations)
– Wide-area network connection currently 20 Gbit/s
[Plots (August): export from FNAL peaked at more than 1 GB/s in a day; import to FNAL peaked at more than 250 MB/s in a day]
7
Fermilab T1 facility usage
Fermilab contributes significantly to the overall CMS computing:
– Major contribution to MC production (own production and archival of samples produced on US T2 centers)
– Major contribution to standard operations (re-reconstruction, skimming, etc.)
User analysis contribution goes beyond the T1 facility:
– Large user analysis activity, not only on the T1 facility
– LPC-CAF used extensively by Fermilab, US and international collaborators for various analysis purposes
Operation and extension of the facility are manpower intensive:
– Admin staff continuously maintains the systems
– Scaling issues frequently arise while increasing the size
– The 4-year ramp-up plan helps solve scaling problems in a timely manner
– Strong support will be required in the future for successful operation
[Plots (August): successful analysis jobs (dark green), more than 50,000 in August; successful production jobs (dark green), more than 100,000 in August]
8
Software
One of the highest-profile software contributions of Fermilab: the CMS software framework
– Fermilab led the reengineering efforts
– Main support for the core framework is provided by Fermilab
– Significant contribution to simulation development from Fermilab
Reengineering started in 2005 and the transition was completed in 2006:
– Standard for simulation, reconstruction and analysis
– Over a quarter billion MC events produced so far
– Used successfully in: previous data challenges, cosmic data taking tests (limited number of tracking, muon and calorimeter detectors) and the tracker integration test, physics preparation MC studies
– Aggressive release schedule up to the start of data taking
Large development and support effort of software at Fermilab (~10 FTE):
– Framework core development, persistency model, services, etc.
– Distributed database system Frontier / integration into the database interface CORAL
– Detector simulation
– Configuration language
– Software development tools, build and distribution system
– User support & software support for physics
Very large international developer base supported by Fermilab
9
Future expectations
– CMS software efforts are expected to increase significantly even after the start of data taking
– Current Fermilab software support and development efforts are required to continue at least at the same scale to meet demands
– They might even have to be increased to provide sufficient support and development for CMS
[Timeline figure: start of data taking]
10
Fermilab's Strategy for Physics Preparation
Fermilab prepares to provide local expertise in all aspects of physics analyses with CMS data:
– Simulation
– Reconstruction
– Analysis software support
Benefits from the large experience of Fermilab physicists from CDF and D0.
Strong connection with the detector responsibilities (as shown in the talk by K. Maeshima):
– Tracker
– HCAL
– Muon endcap detector
Goal:
– Support the US community in all aspects related to detector commissioning and performance and physics analyses
– Provide leadership and expertise in CMS to drive development and progress
11
Tracking
Fermilab provides a co-convener of the LPC tracking group:
– Support for LPC affiliates in track reconstruction
– Development of the 2nd track reconstruction algorithm for CMS
– Cosmic track reconstruction during several commissioning tests with all available algorithms (for detector operation and commissioning, see the talk of K. Maeshima)
Tracker commissioning:
– Surface commissioning of 15% of the CMS silicon strip tracker from May to July 2007 at CERN
– Over 5M cosmic triggers have been recorded
– First comparisons of simulated cosmic ray events to data show good agreement
[Figures: cosmic muon; results for a run with 100k events (T = -10°C); track reconstruction in real cosmics data; MC tuning using real cosmics data]
12
Electrons & Photons
Fermilab provides a co-convener of the LPC egamma group:
– Support for LPC affiliates in photon and electron reconstruction/identification, for purposes of commissioning, monitoring, and physics analysis
Contributions to the CMS effort related to electrons and photons:
– Design of the initial measurement of the cross section for inclusive Z and W production in the electron decay channel (with Bristol, Imperial, Princeton, Minnesota)
– Detailed scheme for measuring online and offline electron efficiencies in first CMS data
– Work within the Egamma POG and EWK PAG to produce a 2007 physics paper
Similar analyses can double as monitoring tasks for ECAL and trigger commissioning, which can be supported at LHC@FNAL for online/offline DQM (see the talk of K. Maeshima: Detector & Commissioning).
13
Muons
Fermilab provides a co-convener of the LPC muon group:
– Support for LPC affiliates in muon reconstruction/identification
Recent activities:
– Development and optimization of local muon pattern and track reconstruction in the CMS endcap muon system (in collaboration with Northwestern U.); allows for improvements of the muon segment finders
– Development (with CERN and UCSB) of muon identification variables: uses tracks, associated calorimeter energy in all calorimeters, and associated muon chamber segments
– Studies indicate very good muon/hadron separation: low fake rates (below 1%, turquoise histogram) at still very high muon reconstruction efficiency (>99% at 100 GeV)
[Event display: recent study of events with muon (green hits) induced electron (blue hits) activity. Key: E = electron simhit, M = muon simhit, R = RecHit, R = RecHit on segment, S = segment; axes: loc. x (cm) vs. loc. z (cm)]
14
Jets: trigger and datasets
Fermilab provides a co-convener of the LPC Jet/MET group:
– Support for LPC affiliates in Jet/MET reconstruction
Studies of single-jet triggers & dataset sizes:
– Existing triggers can be improved
Recommendations for real data taking:
– Decrease a few of the prescales and move a few of the HLT thresholds in order to balance the dataset sizes
– Gives a tractable size of data samples for analysis
– Gives better overlap for trigger efficiency measurements
[Tables: existing triggers vs. proposed triggers]
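To make the balancing argument concrete, the sketch below shows the basic prescale arithmetic: the recorded rate of a trigger is its raw rate divided by its prescale, and the dataset size follows from the recorded rate. All trigger names, rates, prescales and event sizes are hypothetical placeholders, not the actual CMS single-jet menu.

```python
# Minimal sketch of trigger prescale arithmetic: the effective (recorded) rate
# of a prescaled trigger is the raw rate divided by the prescale, and the
# dataset size over a run is effective rate * live time * event size.
# All numbers below are hypothetical placeholders, not CMS trigger settings.

triggers = {
    # name: (raw_rate_hz, prescale)
    "HLT_Jet30":  (2000.0, 1000),
    "HLT_Jet60":  (300.0, 100),
    "HLT_Jet110": (20.0, 1),
}

live_time_s = 12 * 3600      # one 12-hour period of data taking, hypothetical
event_size_mb = 1.5          # hypothetical average event size

for name, (raw_rate, prescale) in triggers.items():
    eff_rate = raw_rate / prescale
    n_events = eff_rate * live_time_s
    size_gb = n_events * event_size_mb / 1024.0
    print(f"{name:12s} effective rate {eff_rate:6.1f} Hz -> "
          f"{n_events:10.0f} events, ~{size_gb:7.1f} GB")
```

Balancing the prescales and thresholds amounts to choosing these numbers so that the per-trigger dataset sizes come out comparable while adjacent triggers still overlap enough for efficiency measurements.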
15
Jet energy correction
Plan: jet corrections will be factorized, correcting for each factor in a fixed sequence up to a level chosen by the user:
– Reco jet → L1 Offset → L2 Relative (η) → L3 Absolute (pT) → L4 EMF → L5 Flavor → L6 UE → L7 Parton → calibrated jet
Example, L2: response in η via dijet balance (barrel jet with |η| < 1.3, probe jet at any η; response obtained from the pT of the probe jet relative to the pT of the barrel jet):
– Dijet balance uses in-situ data to measure the response and corrections vs. η
– Dijet balance shows good agreement with MC truth
– Bias in the method is small (2-3%) and insensitive to the QCD spectrum
– Plan: use the dijet balance technique early in CMS running
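A minimal sketch of the two ideas above: a factorized correction chain applied in a fixed sequence, and the dijet-balance estimate of the relative response. The functional forms and numbers are invented placeholders; only the structure (chained correction levels, response from the probe-to-barrel pT ratio) follows the slide.

```python
# Sketch of a factorized jet energy correction: each level transforms the jet
# pT, applied in the fixed sequence L1 offset -> L2 relative (eta) ->
# L3 absolute (pT) -> ... up to the level the user chooses.
# The functional forms here are invented placeholders.

def l1_offset(pt, eta):   return pt - 0.5                      # subtract offset (placeholder)
def l2_relative(pt, eta): return pt * (1.0 + 0.02 * abs(eta))  # flatten response vs eta (placeholder)
def l3_absolute(pt, eta): return pt * 1.10                     # absolute scale (placeholder)

CORRECTION_CHAIN = [l1_offset, l2_relative, l3_absolute]  # L4-L7 would follow the same pattern

def correct_jet(pt, eta, up_to_level=3):
    """Apply the factorized corrections in sequence up to the chosen level."""
    for level in CORRECTION_CHAIN[:up_to_level]:
        pt = level(pt, eta)
    return pt

# Dijet balance: the relative response of a probe jet (any eta) is estimated
# from its pT ratio to a barrel jet (|eta| < 1.3) in back-to-back dijet events.
def relative_response(dijet_events):
    """dijet_events: list of (pt_barrel, pt_probe) pairs in one probe-eta bin."""
    ratios = [pt_probe / pt_barrel for pt_barrel, pt_probe in dijet_events]
    return sum(ratios) / len(ratios)

if __name__ == "__main__":
    print("corrected pT:", correct_jet(40.0, 2.1))
    print("relative response:", relative_response([(50.0, 46.0), (62.0, 57.5)]))
```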
16
Example analysis: Dijet mass
Dijet mass:
– Classic resonance search analysis in rate vs. mass (bump hunting)
– The dijet is defined as the two leading jets
– Require each leading jet to be inside |η| < cut: the old Tevatron cut was |η| < 1, the optimized cut is |η| < 1.3
Could see dijet resonances early at the LHC:
– One day of running at L = 10^32 cm^-2 s^-1 (10 pb^-1)
– Gives a 4 sigma signal for a 2 TeV excited quark (the Tevatron has excluded m < 0.78 TeV)
This analysis is a prominent example of profiting from the CDF/D0 experience of Fermilab physicists in preparing for the analysis of CMS data.
[Diagram: dijet resonances in dijet mass; new particles X produced in parton-parton annihilation (q, q, g) decay to 2 partons (dijets)]
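As a sketch of the kinematics only (not the CMS analysis code), the dijet mass can be computed from the four-vectors of the two leading jets after the |η| < 1.3 requirement:

```python
import math

def four_vector(pt, eta, phi, m=0.0):
    """Build (E, px, py, pz) from pt, eta, phi, treating the jet as massless by default."""
    px = pt * math.cos(phi)
    py = pt * math.sin(phi)
    pz = pt * math.sinh(eta)
    e = math.sqrt(m * m + px * px + py * py + pz * pz)
    return (e, px, py, pz)

def dijet_mass(jets, eta_cut=1.3):
    """Invariant mass of the two leading-pT jets, requiring both inside |eta| < eta_cut.

    jets: list of (pt, eta, phi) tuples. Returns None if the selection fails.
    """
    leading = sorted(jets, key=lambda j: j[0], reverse=True)[:2]
    if len(leading) < 2 or any(abs(eta) >= eta_cut for _, eta, _ in leading):
        return None
    e1, px1, py1, pz1 = four_vector(*leading[0])
    e2, px2, py2, pz2 = four_vector(*leading[1])
    e, px, py, pz = e1 + e2, px1 + px2, py1 + py2, pz1 + pz2
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))

# Hypothetical event: two hard central jets plus one soft forward jet.
print(dijet_mass([(850.0, 0.4, 0.1), (820.0, -0.9, 3.2), (45.0, 2.5, 1.0)]))
```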
17
Summary & Outlook
CMS is currently performing final preparations for the start of data taking in July 2008.
Fermilab contributes to all areas of the CMS program, presented here for:
– Computing
– Software
– Physics Preparation
Physicists and professionals of the Fermilab CMS Center contribute significantly to both the development and the maintenance of hardware, services and software used by all of CMS.
18
Backup: Fermilab T1 facility
The finished facility comprises the combined Tier-1 + LPC-CAF systems:
– 7.3 MSi2k, 2.5 PB data disk, 4.7 PB tape
[Plot: ramp-up projections until 2008]
19
Backup: Fermilab T1 Network
20
Backup: ProdAgent
The ProdAgent infrastructure runs all the organized processing in CMS:
– Originally developed for simulated event production
– Used in CSA06 for reprocessing and organized skimming
– Adopted in February 2007 to also operate the Tier-0 workflow
It represents the tools that manage workflows on approximately 80% of the computing resources in CMS.
Use of a common tool for all organized processing tasks increases efficiency for the experiment.
– Fermilab contributes ~3 FTE to the development effort (main developer)
21
Backup: CRAB
CMS requires equal fair-share access to all CMS data for all CMS users.
User tool: CRAB (CMS Remote Analysis Builder)
Four simple user steps:
– Job creation, including data discovery and job splitting
– Job submission via the LCG/gLite Resource Broker or Condor-G
– Job status check
– Job output retrieval
[Diagram: user → CRAB → DBS/DLS data discovery → grid middleware (EGEE Resource Broker or OSG Condor-G submission) → T1 and T2 centers]
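A schematic outline of the four user steps in Python, with hypothetical function names and return values standing in for the real CRAB commands and configuration; it is meant only to show the sequence of operations described above, not the CRAB API.

```python
# Schematic outline of the CRAB user workflow described on this slide.
# Function names and return values are hypothetical placeholders; the real
# tool is driven by a configuration file and command-line invocations.

def create_task(dataset, events_per_job):
    """Step 1: job creation - discover where the dataset is hosted (via DBS/DLS)
    and split the requested events into grid jobs."""
    sites = ["T2_example_A", "T2_example_B"]   # placeholder data-discovery result
    return [{"first_event": i, "n_events": events_per_job, "sites": sites}
            for i in range(0, 10 * events_per_job, events_per_job)]

def submit(jobs, middleware="Condor-G"):
    """Step 2: submission via the LCG/gLite Resource Broker or Condor-G (OSG)."""
    return [{"id": n, "middleware": middleware, "status": "Submitted"}
            for n, _ in enumerate(jobs)]

def check_status(handles):
    """Step 3: poll the grid for the status of each job."""
    return {h["id"]: h["status"] for h in handles}

def get_output(handles):
    """Step 4: retrieve the output files of finished jobs."""
    return [f"job_{h['id']}_output.root" for h in handles]

jobs = create_task("/ExampleDataset/RECO", events_per_job=1000)
handles = submit(jobs)
print(check_status(handles))
print(get_output(handles))
```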
22
Backup: Software
CMSSW Software Release Plan
1_6_0: opened July 2, closed July 23, deployed Sept 3
– Goals are to prepare for CSA07 and the Global Runs (Sep-Oct)
– Integrate HLT components developed for the HLT exercise
– Use HLT trigger paths to create streamed primary datasets for testing analysis and calibration data-flows & work-flows
1_7_0: opened Sept 11, close Oct 2, deploy Oct 26
– Goals are to prepare for the Cosmics Run
– Allow backward-incompatible changes in geometry & data formats
1_8_0:
– Integrate changes resulting from the CSA07 experience
– Continue production of MC samples required for on-going detector and physics studies
2_0_0:
– Integrate changes resulting from the Cosmic Run experience
– Start production of physics MC samples 3 months before the start of data-taking
[Timeline (June-March): Pre-CSA07, CSA07, Cosmic Run; 50M evts/month, then MC production for startup at 100M evts/month; GEN+SIM production with 14X, DIGI+RECO production with 15X]
23
Backup: Tracking
[Figures: cosmic trigger geometry (adjustable; final geometry shown); example run of 100k events (T = -10°C); performance over longer operation periods]
24
Backup: Electrons & Photons
Detailed scheme for measuring online and offline electron efficiencies in first CMS data: tag-and-probe analysis
– Configurable producers build a TagProbeCollection from EM objects, tracks or electrons with ID; a TagProbeAnalyzer (to be added) performs the (multi-dimensionally binned) efficiency measurement
– Outputs: ID variable distributions, including background subtraction and uncertainties
– A CMSSW path measures the efficiency of any selection step(s), e.g. track-supercluster matching efficiency as a function of E_T and η
Applications:
– Online monitoring: with ~1 Hz of Z→ee at 10^32, it should be possible to measure the online selection ID and trigger efficiency frequently for feedback to trigger and ECAL shifters
– Offline monitoring: the efficiency with full calibration and reconstruction can also be sampled frequently for DQM purposes before passage to Tier-2
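At its core, tag-and-probe is a counting measurement: among probes paired with a well-identified tag (e.g. inside the Z→ee mass window), the efficiency of a selection step is the fraction of probes that also pass it, repeated in bins of E_T and η. A minimal sketch, with a hypothetical data layout and no background subtraction:

```python
import math

def tag_and_probe_efficiency(probes, passes_cut):
    """probes: list of probe objects (here plain dicts) already paired with a good
    tag, e.g. within the Z->ee mass window; passes_cut: predicate for the selection
    step being measured. Returns (efficiency, binomial uncertainty)."""
    n_all = len(probes)
    n_pass = sum(1 for p in probes if passes_cut(p))
    eff = n_pass / n_all
    err = math.sqrt(eff * (1.0 - eff) / n_all)
    return eff, err

# Hypothetical probes: electron candidates with ET, eta and a flag recording
# whether they also passed the identification step under study.
probes = [{"et": 35.0, "eta": 0.4, "passed_id": True},
          {"et": 28.0, "eta": -1.1, "passed_id": True},
          {"et": 42.0, "eta": 2.0, "passed_id": False},
          {"et": 31.0, "eta": 0.9, "passed_id": True}]

eff, err = tag_and_probe_efficiency(probes, lambda p: p["passed_id"])
print(f"ID efficiency = {eff:.2f} +/- {err:.2f}")
# In practice the same counting is repeated in bins of ET and eta, with
# background subtraction, to map the efficiency as a function of kinematics.
```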
25
Backup: Muons
Development (with CERN and UCSB) of muon identification variables:
– This track-based inside-out MuID approach provides an alternative to outside-in (matching muon chamber tracks to tracker tracks)
– Extrapolate tracks through the whole detector and associate with the track: ECAL energy in crossed cells, HCAL energy, outer HCAL energy, and muon chamber segments
– Use the associated information to calculate a 3D likelihood ratio for muon vs. hadron calorimeter compatibility (1 for muon, 0 for hadron, y-axis) and a weight-based muon-segment compatibility (x-axis)
– With a basic cut as indicated by the light blue line, studies indicate very good muon/hadron separation: low fake rates (below 1%) at still very high muon reconstruction efficiency (>99% at 100 GeV), also in the difficult low-momentum regime
[Plots: single muons, pT = 3 to 100 GeV/c; single pions, pT = 3 to 100 GeV/c]
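To illustrate how the two compatibility variables might be combined into a selection: the slide specifies only that the calorimeter compatibility is a likelihood ratio near 1 for muons and near 0 for hadrons, and that a basic two-dimensional cut gives below 1% fake rate at above 99% efficiency; the cut values and candidates below are placeholders, not the CMS implementation.

```python
# Sketch of a cut on the two muon-identification variables described above:
# a calorimeter-compatibility likelihood ratio (1 = muon-like, 0 = hadron-like)
# built from the ECAL, HCAL and outer-HCAL energy associated to the extrapolated
# track, and a segment-compatibility weight from associated muon-chamber
# segments. The cut values below are illustrative placeholders only.

def is_muon(calo_compatibility, segment_compatibility,
            calo_cut=0.8, segment_cut=0.5):
    """Accept a track as a muon candidate if both compatibilities are high."""
    return calo_compatibility > calo_cut and segment_compatibility > segment_cut

# Hypothetical candidates: (calo compatibility, segment compatibility)
candidates = [(0.95, 0.9),   # muon-like in both variables
              (0.10, 0.2),   # hadron-like: calorimeter shower, little segment activity
              (0.85, 0.3)]   # ambiguous: good calo compatibility, few segments

for calo, seg in candidates:
    print(calo, seg, "->", "muon" if is_muon(calo, seg) else "rejected")
```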