CMS Computing Model summary, UKI Monthly Operations Meeting, Olivier van der Aa

Tier Roles in the CMS model

Tier functions:
– Tier0 (CERN): process RAW data to make RECO objects; hold a copy of the RAW and RECO objects.
– Tier1: re-reconstruction, calibration, skimming and analysis. Each T1 holds a fraction (16%) of the RAW and RECO data.
– Tier2: MC production, analysis, calibration. Will receive a fraction of the RECO/AOD; ~200 TB and 0.9 MSI2K per Tier2.

Data types (data reduction):
– RAW (1.5 MB/evt): detector data + trigger data.
– RECO (250 kB/evt): reconstructed objects (e, µ, γ, jets) + reconstructed hits.
– AOD (50 kB/evt): Analysis Object Data; reconstructed objects (e, µ, γ, jets) + the hits required for track refit.
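These per-event sizes imply a steep reduction from RAW to AOD. The short Python sketch below works through the arithmetic; only the per-event sizes and the 200 TB Tier2 disk come from the slide, while the 10^9-event sample size is an assumption chosen purely to illustrate the scale.

# Back-of-envelope check of the CMS data-tier sizes quoted above.
# Event sizes and the Tier2 disk figure are from the slide; the
# 1e9-event sample size is an illustrative assumption only.

RAW_KB, RECO_KB, AOD_KB = 1500, 250, 50    # per-event sizes (kB)
T2_DISK_TB = 200                           # nominal Tier2 disk

events = 1e9                               # assumed sample size (illustrative)
for name, size_kb in [("RAW", RAW_KB), ("RECO", RECO_KB), ("AOD", AOD_KB)]:
    tb = events * size_kb * 1e3 / 1e12     # kB -> bytes -> TB
    print(f"{name}: {size_kb} kB/evt -> {tb:.0f} TB for {events:.0e} events")

# How many AOD events fit on a nominal 200 TB Tier2 disk?
print(f"AOD events per {T2_DISK_TB} TB: {T2_DISK_TB * 1e12 / (AOD_KB * 1e3):.1e}")

With these numbers, 10^9 events correspond to roughly 1500 TB of RAW, 250 TB of RECO and 50 TB of AOD, and a nominal 200 TB Tier2 disk can hold about 4x10^9 AOD events.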

Tier2 and CMS

Tier2 Clients:
– "Local community": some fraction kept free for private use.
– "CMS controlled": hosts specific analyses determined by the physics groups and by local interest.
– Opportunistic: soaking up of spare capacity by any CMS user.

Tier2 User Services:
– Medium- or long-term storage of AOD.
– Support for interactive bug finding.
– Optimised access to the CMS central database servers for obtaining conditions and calibration data.

Tier2 System Services:
– A mechanism for prioritising resource access between competing remote and local users (see the sketch after this list).
– Provision of the software, servers and local databases required to operate the CMS workload and data management services.
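To make the client classes concrete, the sketch below partitions a nominal Tier2's CPU among them. The 0.9 MSI2K total is from the slides; the share fractions are hypothetical assumptions, not CMS policy, and stand in for whatever fair-share mechanism a site actually uses.

# Illustrative split of a nominal Tier2's CPU between the three client
# classes named above. Total CPU is from the slide; the share fractions
# are assumed for illustration only.

T2_CPU_MSI2K = 0.9

shares = {                 # assumed fair-share fractions (hypothetical)
    "local community": 0.3,
    "CMS controlled":  0.5,
    "opportunistic":   0.2,
}

assert abs(sum(shares.values()) - 1.0) < 1e-9  # shares must cover the full capacity

for client, frac in shares.items():
    print(f"{client:>15}: {frac:4.0%} -> {frac * T2_CPU_MSI2K:.2f} MSI2K")

In practice this kind of split would be enforced by the local batch system's fair-share configuration rather than in code, but the bookkeeping is the same.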

Tier2 nominal numbers (for a nominal T2 in 2008):
– CPU = 0.9 MSI2K
– Disk = 200 TB
– WAN = 1 Gb/s for each site in the T2
– Data import = 5 TB/day from the T1, for AOD and other data replicated at the T2
– Data export = 1 TB/day, essentially MC production shipped to the T1
– Job submission frequency: 1 job / 20 s
Information source:
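As a rough sanity check of these nominal figures, the sketch below converts the daily transfer volumes into an average line rate and the submission frequency into jobs per day; it uses only the numbers quoted above.

# Sanity check of the nominal Tier2 numbers: does 5 TB/day in plus
# 1 TB/day out fit on a 1 Gb/s WAN link, and how many jobs per day
# does 1 job / 20 s correspond to?

SECONDS_PER_DAY = 86_400

def tb_per_day_to_gbps(tb_per_day: float) -> float:
    """Convert a daily volume in TB to the average line rate in Gb/s."""
    return tb_per_day * 1e12 * 8 / SECONDS_PER_DAY / 1e9

import_gbps = tb_per_day_to_gbps(5.0)   # AOD and other data from the T1
export_gbps = tb_per_day_to_gbps(1.0)   # MC production shipped to the T1

print(f"import: {import_gbps:.2f} Gb/s, export: {export_gbps:.2f} Gb/s, "
      f"total: {import_gbps + export_gbps:.2f} Gb/s of the 1 Gb/s WAN")

print(f"jobs per day at 1 job / 20 s: {SECONDS_PER_DAY // 20}")

The nominal transfers average roughly 0.46 Gb/s inbound and 0.09 Gb/s outbound, i.e. a bit over half of the 1 Gb/s WAN provision, and the quoted submission frequency corresponds to 4320 jobs per day.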