Task 6.1: Installing and testing components of the LCG infrastructure to achieve full-scale functionality. CERN-INTAS 03-52-4297, 25 June 2006, Dubna. V.A. Ilyin

RuTier2 Cluster

Conception: a cluster of institutional computing centres with Tier2 functionality, operating for all four experiments: ALICE, ATLAS, CMS and LHCb.
Basic functions: analysis, simulations, user data support, plus some Tier1 functions.

Participating institutes:
- Moscow: ITEP, SINP MSU, RRC KI, LPI, MEPhI, …
- Moscow region: JINR, IHEP, INR RAS
- St. Petersburg: PNPI, SPbSU
- Novosibirsk: BINP
- …

RuTier2 status – WLCG MoU

Financing agencies: Federal Agency on Science and Innovations (FASI) and Joint Institute for Nuclear Research (JINR). Tier2 facilities are to be installed in Russia for ALICE, ATLAS, CMS and LHCb.

Russia and JINR representatives in the C-RRB: Yu.F. Kozlov (FASI) and V.I. Savrin (SINP MSU) for Russia; A.N. Sisakian for JINR.
Representatives in the WLCG Collaboration Board: V.A. Ilyin (SINP MSU), alternate V.V. Korenkov (JINR).

The WLCG MoU was delivered to FASI in February; official approval in Russia is in progress (now to be agreed with the Ministry of Finance). The relevant Annexes are prepared (corrections for 2006 are coming): A6.4 (computing capacities: CPU, disk, tape, WAN), A6.5 (Russia as one of the WLCG Operations Centres), A6.6 (manpower contribution to common WLCG software).

RuTier2 planning

Preliminary summary of computing capacities year by year for the computing facilities (worked out at the beginning of 2005):
[table: per-year "+/year" and "in use" figures for CPU (KSI2K), usable disk (TB), active tape (TB) and shelved tape (TB)]

RuTier2 to the LCG start

2006:
- FASI budget for equipment: about 1.7 MEuro (not confirmed yet), about 30% smaller than requested;
- JINR budget is not known yet;
- plus additional money from internal sources of the participating institutes (SINP MSU, RRC KI, PNPI, ITEP, IHEP, …).

Equipment status for this year: the budget is to be fixed/known by July-August 2006; installation in autumn 2006, available for the experiments at the end of 2006. Already clear: no tapes, 1500 KSI2K CPU and 600 TB disk (a 25% reduction); there could be a further reduction.

Budget planning for the following years: about 2 MEuro per year for equipment.

Time milestones for the equipment installation (understanding as of spring 2006): (3-5%), 40%, 60%.

RuTier2 in the World-Wide Grid

The RuTier2 computing facilities are operated by the Russian Data-Intensive Grid (RDIG), which we are creating as the Russian segment of the European grid infrastructure EGEE.
- RuTier2 sites (institutes) are RDIG-EGEE Resource Centres.
- Basic grid services are provided by RRC KI, SINP MSU and JINR.
- Operational functions are provided by IHEP, ITEP, PNPI and JINR.
- The regional Certificate Authority and security are supported by RRC KI.
- User support (Call Center, link to GGUS at FZK): ITEP.

RDIG budget (about 1 MEuro per year): ~50% from EU FP6 EGEE (EGEE-II); ~50% from FASI (two grid technology projects) and Rosatom. The EGEE-II contract with EU FP6 has been signed recently; the FASI and Rosatom budget is under constructive approval.

Final draft of WLCG MoU:
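The resource centres above are published in the EGEE/LCG information system and can be listed from any LCG User Interface. A minimal sketch, assuming the lcg-infosites client of the LCG-2/gLite era is installed; the output format varies by release, and the `.ru`/`.su` domain filter is only a crude illustration of picking out RDIG sites:

```python
# Sketch: list the computing elements visible to a VO through the
# LCG-2/gLite information system, then pick out Russian-domain sites.
# Assumes the lcg-infosites client from an LCG UI is on the PATH.
import subprocess

def list_ces(vo):
    """Return raw 'lcg-infosites --vo <vo> ce' output for the given VO."""
    out = subprocess.run(
        ["lcg-infosites", "--vo", vo, "ce"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout

if __name__ == "__main__":
    for line in list_ces("alice").splitlines():
        # Crude filter for RDIG sites by domain suffix (illustrative only).
        if ".ru" in line or ".su" in line:
            print(line)
```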

RuTier2: contribution to LCG common software

Contribution to the development of grid middleware and application software for common use by WLCG and the experiments. Tasks:
1. Contributions of the experiments to ARDA
2. Testing of new middleware (SA3 activity, partly by CERN-INTAS)
3. Development of new middleware (basically within the new CERN-INTAS project)
4. CASTOR: development of massive data storage software
5. PH/GENSER: grid-enabled library of MC event generators
6. PH/MCDB: grid-enabled MC event databases

LCG 1st-phase contribution by Russia and JINR: 3 FTE. WLCG MoU: Russia 2 FTE, JINR 1 FTE. Visiting budget for 2006-…: Russia 2 FTE approved, JINR 1 FTE approved.

International Connectivity

International connectivity for Russian science is based today on a 622 Mbps link to GEANT2 (2.5 Gbps from autumn 2006), Moscow (RASNet) – Frankfurt (GEANT2). Rates of 5~10~20 MByte/s have been achieved for LCG data transfers in the first experiments within the Service Challenge activity (SINP, JINR, ITEP).

Another channel is also available to us: the 2.5 Gbps link Moscow – St. Petersburg – Stockholm operated by RUNNet, and onward to Amsterdam (SURFnet) operated by RBNet (GLORIAD). Next step: to test these links for SC4 needs (started with Kors Bos).

Connectivity with the USA, China, Japan and Korea LCG partners goes through GLORIAD:
- 622 Mbps Chicago – Amsterdam – St. Petersburg – Moscow
- 155 Mbps Moscow – Novosibirsk – Khabarovsk – Beijing
Plans: Mbps – 1 Gbps, Gbps
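For a rough feel of what the achieved Service Challenge rates mean in practice, a back-of-the-envelope sketch; the 1 TB dataset size is an arbitrary example, not a figure from the slide:

```python
# Back-of-the-envelope transfer times at the Service Challenge rates
# quoted above (5, 10 and 20 MByte/s). The 1 TB dataset size is an
# arbitrary illustration.
def transfer_days(size_tb, rate_mb_per_s):
    """Days needed to move size_tb terabytes at rate_mb_per_s MByte/s."""
    size_mb = size_tb * 1e6          # 1 TB = 10^6 MB (decimal units)
    return size_mb / rate_mb_per_s / 86400.0

for rate in (5, 10, 20):
    print(f"1 TB at {rate:2d} MB/s: {transfer_days(1, rate):.1f} days")
# -> roughly 2.3, 1.2 and 0.6 days respectively
```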

GÉANT2 Topology (Oct. 2005)
[topology map] November 2005: a GEANT2 Point-of-Presence opened in Moscow, 2x622++ Mbps.

Regional Connectivity

- Moscow: 1 Gbps (ITEP, RRC KI, SINP MSU, … LPI, MEPhI)
- IHEP (Protvino): 100 Mbps fibre-optic (plans to move to 1 Gbps)
- JINR (Dubna): 1 Gbps f/o (from December 2005)
- PNPI (Gatchina): 1 Gbps f/o for LCG (2 Mbps commodity Internet)
- BINP (Novosibirsk): Mbps (GLORIAD++)
- INR RAS (Troitsk): 10 Mbps commodity Internet; a new f/o project to start!
- SPbSU (St. Petersburg): 1 Gbps (?)

Our pragmatic goal for 2007:
- all RuTier2 sites to have at least 100 Mbps f/o dedicated to network provision for RDIG users;
- 1 Gbps dedicated connectivity between the basic RDIG sites, and 1 Gbps connectivity to EGEE via GEANT2/GLORIAD.

Today the Russian LHC experiments work (or plan to work) with these T1s:
- ALICE: FZK
- ATLAS: SARA
- CMS: FZK (CERN?)
- LHCb: CERN

ARDA + ALICE in 2006 (Russia)

At this time we have VO boxes, with the application software installed on them, at:
- ITEP (Moscow) + IHEP (Protvino)
- INR (Troitsk)
- JINR (Dubna) + KI (Moscow)
- SPbSU (St. Petersburg)

Under installation: PNPI (Gatchina) and SINP (Moscow).

Tier2 resources available

[table: per-site CPU (MKSI2K), Disk (TB), Tape (TB) and BW to CERN/T1 (Gb/s), each with the percentage of the pledged value, for USA, FZU Prague, RDIG, French T2, GSI, U. Muenster, Polish T2*, Slovakia, and the total]

Russia ~ 5%

What we can do

[table: number of events, number of jobs, CPU work (CPU days), duration (days), data volume (TB, RAW and ESD) and bandwidth (MB/s) for pp (100 M events, 2 M jobs), PbPb (1 M events, 1 M jobs) and the total (3 M jobs)]

Assuming 85% CPU efficiency. Russia ~ 5%
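To make the kind of estimate behind such a table concrete, a minimal sketch; only the job count and the 85% efficiency come from the slide, while the per-job CPU time and the farm size are invented for illustration:

```python
# Sketch of the estimate behind the table above: how long a production
# of N jobs takes on a farm of a given size at 85% CPU efficiency.
# The 0.5 CPU-hour per-job cost and the 1500-CPU farm are invented
# numbers; only the job count and efficiency come from the slide.
def production_days(n_jobs, cpu_hours_per_job, n_cpus, efficiency=0.85):
    cpu_work_days = n_jobs * cpu_hours_per_job / 24.0   # total CPU days
    return cpu_work_days / (n_cpus * efficiency)

# e.g. the pp sample: 2 million jobs of 0.5 CPU hours each
days = production_days(n_jobs=2_000_000, cpu_hours_per_job=0.5, n_cpus=1500)
print(f"{days:.0f} days")   # ~33 days on this hypothetical farm
```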

ATLAS in Russia

8 institutes: ITEP, LPI, MEPhI, SINP (all Moscow), BINP (Novosibirsk), IHEP (Protvino), PNPI (Gatchina). 5 of them have LCG-2 farms with about 340 CPUs in total. At the computing/physics meeting in Protvino, all 8 institutes expressed interest in deploying Russia/ATLAS Tier2 resources.

CMS software installed at RuTier2 LCG-2 sites

IHEP: VO-cms-slc3_ia32_gcc323
INR: VO-cms-OSCAR_3_6_5_SLC3_dar; VO-cms-ORCA_8_7_1_SLC3_dar; VO-cms-slc3_ia32_gcc323; VO-cms-ORCA_8_10_1; VO-cms-CMKIN_4_4_0_dar
ITEP: VO-cms-CMKIN_4_1_0_dar; VO-cms-CMKIN_4_2_0_dar; VO-cms-CMKIN_4_4_0_dar; VO-cms-PU-mu_Hit3653_g133; VO-cms-OSCAR_3_6_5_SLC3_dar; VO-cms-ORCA_8_7_1_SLC3_dar; VO-cms-slc3_ia32_gcc323; VO-cms-ORCA_8_7_5; VO-cms-COBRA_8_5_0
JINR: VO-cms-CMKIN_4_1_0_dar; VO-cms-CMKIN_4_2_0_dar; VO-cms-CMKIN_4_4_0_dar; VO-cms-OSCAR_3_6_5_SLC3_dar; VO-cms-ORCA_8_7_1_SLC3_dar; VO-cms-ORCA_8_4_0; VO-cms-COBRA_8_5_0; VO-cms-ORCA_8_7_5; VO-cms-slc3_ia32_gcc323
RRC KI: VO-cms-CMKIN_4_2_0_dar; VO-cms-OSCAR_3_6_5_SLC3_dar; VO-cms-ORCA_8_7_1_SLC3_dar; VO-cms-slc3_ia32_gcc323; VO-cms-ORCA_8_7_4
SINP MSU: VO-cms-CMKIN_4_4_0_dar; VO-cms-OSCAR_3_6_5_SLC3_dar; VO-cms-ORCA_8_7_1_SLC3_dar; VO-cms-PU-mu_Hit3653_g133; VO-cms-ORCA_8_7_5; VO-cms-slc3_ia32_gcc323; VO-cms-COBRA_8_5_0
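These VO-cms-* strings are software tags that sites publish in the Glue 1.x schema of the LCG information system, in the attribute GlueHostApplicationSoftwareRunTimeEnvironment. A minimal sketch of querying them over LDAP, assuming python-ldap is available; the BDII endpoint below is illustrative, not a real host:

```python
# Sketch: find which subclusters publish a given CMS software tag.
# Tags are advertised in the Glue 1.x attribute
# GlueHostApplicationSoftwareRunTimeEnvironment of the BDII, which is
# an ordinary LDAP server. The BDII host below is hypothetical.
import ldap  # python-ldap

BDII = "ldap://some-top-bdii.example.org:2170"   # hypothetical endpoint
BASE = "o=grid"                                  # usual top-level BDII base

def sites_with_tag(tag):
    conn = ldap.initialize(BDII)
    results = conn.search_s(
        BASE, ldap.SCOPE_SUBTREE,
        f"(GlueHostApplicationSoftwareRunTimeEnvironment={tag})",
        ["GlueSubClusterUniqueID"],
    )
    return sorted({dn for dn, _attrs in results})

for dn in sites_with_tag("VO-cms-ORCA_8_7_5"):
    print(dn)
```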

Usage of CPU resources at the Russian Tier2, October 2005 – March 2006

CMS jobs at Russian Tier2 sites (October 2005 – March 2006): PNPI – 30%, ITEP – 27%, JINR – 15%, SINP MSU – 13%, INR – 9%, IHEP – 5%, RRC KI – 1%.

Current status of LHCb in Russia

The Russian distributed Tier-2 cluster participates permanently in LHCb activities (~35% of CPU in Russia).

Computing centres: IHEP (Protvino), INR (Troitsk), ITEP (Moscow), PNPI (St. Petersburg), SINP MSU (Moscow), JINR (Dubna).

Massive MC production (Data Challenges) has become a routine task, running with minimal intervention from site managers (via LCG resources or in pure DIRAC mode).
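For flavour, a minimal sketch of the LCG-resources submission path mentioned above: generating a JDL job description of the kind such MC jobs used and submitting it with the LCG-2 tooling of the time. The executable name and arguments are invented for illustration; real LHCb productions were driven by DIRAC rather than hand-written JDL:

```python
# Sketch: write a minimal JDL job description of the kind used to run
# MC production jobs on LCG-2 resources. The executable and arguments
# are illustrative placeholders, not taken from the slide.
JDL = """\
Executable    = "run_mc.sh";
Arguments     = "--events 500";
StdOutput     = "job.out";
StdError      = "job.err";
InputSandbox  = {"run_mc.sh"};
OutputSandbox = {"job.out", "job.err"};
Requirements  = other.GlueCEPolicyMaxCPUTime > 1440;
"""

with open("mc_job.jdl", "w") as f:
    f.write(JDL)

# Submission on an LCG-2 User Interface would then look like:
#   edg-job-submit --vo lhcb mc_job.jdl
```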