Computing for HEP in the Czech Republic
Jiří Chudoba, Institute of Physics, AS CR, Prague, 9.3.2007

Outline
- Groups involved
- Resources
  - Network
  - Hardware
  - People
- Usage
- Outlook

HEP Groups in CZ with Major Computing Requirements
- Institute of Physics, AS CR, Prague: ATLAS, ALICE, D0, H1, CALICE
- Institute of Nuclear Physics, AS CR, Řež u Prahy: ALICE, STAR
- Czech Technical University, Prague: ALICE, ATLAS, STAR, COMPASS
- Charles University, Faculty of Mathematics and Physics, Prague: ATLAS, NA49, COMPASS

Network Connection provided by CESNET
- Czech National Research and Education Network provider
- association of legal entities formed by universities and the Academy of Sciences of the CR

CESNET2 Topology

GEANT2 planned topology

Optical network infrastructure provided by CESNET for the HEP TIER2 farm GOLIAS
- Gbps lines to FNAL, Taipei, FZK
- optical 1 Gbps connections with collaborating institutions in Prague
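To put these link capacities in context, a minimal back-of-the-envelope sketch in Python; only the 1 Gbps line rate comes from the slide, and the idealised assumption is that the link can be kept fully loaded.

# Idealised throughput of a fully loaded 1 Gbps optical link.
# Only the 1 Gbps line rate comes from the slide above; assuming
# no protocol overhead, this is an upper bound, not a measurement.
link_gbps = 1.0
bytes_per_second = link_gbps * 1e9 / 8        # 1 Gbps = 125 MB/s
tb_per_day = bytes_per_second * 86400 / 1e12  # seconds per day -> TB/day

print(f"{bytes_per_second / 1e6:.0f} MB/s, ~{tb_per_day:.1f} TB/day")
# prints: 125 MB/s, ~10.8 TB/day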

Hardware Resources
- Tier2 centre (farm GOLIAS) in the Regional Computing Centre for Particle Physics at the Institute of Physics
  - facility shared with non-HEP projects
  - contributions from the Institute of Nuclear Physics
- CESNET farm SKURUT
  - provided for the EGEE project, used by ATLAS, D0, Auger and others
- smaller local clusters
  - for analysis
  - not shared via grid

Prague T2 center: GOLIAS at FZU
- heterogeneous hardware, due to irregular HW upgrades (from 2002)
- 140 nodes (including blade servers), 320 cores
- ~400k SpecInt2000

Prague T2 center
- 55 TB of raw disk space; 4 disk arrays:
  - 1 TB array (RAID 5), SCSI disks
  - 10 TB array (RAID 5), ATA disks
  - 30 TB array (RAID 6), SATA disks
  - 14 TB iSCSI array with SATA disks
- PBSPro server, fairshare job ratios 50/19/7/23 for D0/ATLAS/ALICE+STAR/others
- possible usage of solid-state physics group resources
- network connections: 1 Gbps lines
  - GEANT2 connection
  - FNAL, p2p, CzechLight/GLIF
  - Taipei, p2p, CzechLight/GLIF
  - FZK Karlsruhe, p2p over GN2, from 1 September 2006
- 150 kW total electric power for computing, UPS, Diesel
- 10 Gbps router in 2007
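The 50/19/7/23 figures are fairshare targets, not a hard partition of the farm. A minimal sketch, assuming the scheduler reaches its targets over a month, of what the ratios mean in CPU-days; the ratios and the 320-core count come from these slides, while the 30-day accounting window is an illustrative assumption.

# Expected monthly CPU-day split implied by the PBSPro fairshare targets.
# The 50/19/7/23 ratios and the 320-core count are from the slides;
# the 30-day accounting window is an illustrative assumption.
shares = {"d0": 50, "atlas": 19, "alice+star": 7, "others": 23}
cores, window_days = 320, 30
available_cpu_days = cores * window_days

total = sum(shares.values())
for group, share in shares.items():
    expected = available_cpu_days * share / total
    print(f"{group:>10}: {share:>2} shares -> ~{expected:.0f} CPU-days per month")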

CESNET farm SKURUT: only a fraction is available for EGEE computing
- 30 dual nodes, used for ATLAS, AUGER, D0 and non-HEP projects (VOCE)
- OpenPBS
- also used as a testbed for middleware and for virtualisation
- prototype connection to the METAcentre resources (a grid-like CESNET initiative)

Manpower: a scarce resource
- 4 administrators for GOLIAS (not 4 FTEs)
  - in charge of HW, OS, MW, network, monitoring, support, ...
  - local administration often done as a part-time job by students
- limited user support

D0 participation in MC production and data reprocessing
- MC production in 2006:
  - 17.5 mil. events generated
  - jobs
  - 1.27 TB copied to FNAL
  - 5 % of total production
- Reprocessing: 26 mil. events, 2 TB
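Because the Prague contribution is quoted as 5 % of total D0 MC production, the numbers above also fix the overall scale and the average event size. A short worked example; only the 17.5 million events, 1.27 TB and 5 % figures come from the slide.

# Scale up the Prague D0 MC production numbers quoted on the slide.
prague_events = 17.5e6      # events generated in 2006
prague_bytes = 1.27e12      # data copied to FNAL (1.27 TB)
prague_fraction = 0.05      # quoted share of total D0 production

avg_event_kb = prague_bytes / prague_events / 1e3        # ~73 kB per event
implied_total_events = prague_events / prague_fraction   # ~350 million events

print(f"average event size: ~{avg_event_kb:.0f} kB")
print(f"implied total D0 MC production: ~{implied_total_events / 1e6:.0f} million events")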

ALICE production in 2006
- running services: vobox, LFC catalogue (J. Chudoba)
- production operation (D. Adamova)
- production started in June 2006
- ALICE was able to use idle resources otherwise reserved for other experiments

ALICE - production in 2006

Provided by Gilbert Poulard

Provided by Gilbert Poulard

Statistics from PBS, GOLIAS
- alice: jobs, 7975 days = 21 years
- atlas: jobs, 8931 days = 24 years
- auger: 282 jobs, 146 days
- d0: jobs, days = 76 years
- h1: 515 jobs, 140 days
- star: 4806 jobs, 41 days
- long queue (local users): 676 jobs, 613 days
- total: jobs, days = 126 years
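The "days = years" figures are simply accumulated walltime divided by 365. A minimal sketch of the conversion, using the per-queue day counts quoted above (queues whose numbers were lost in the transcript are omitted).

# Convert accumulated walltime (CPU-days from PBS accounting) to years.
# The per-queue day counts are those quoted on the slide above.
walltime_days = {
    "alice": 7975,
    "atlas": 8931,
    "auger": 146,
    "h1": 140,
    "star": 41,
    "long queue (local users)": 613,
}

for queue, days in sorted(walltime_days.items(), key=lambda item: -item[1]):
    print(f"{queue:>26}: {days:>5} days = {days / 365:.1f} years")
# e.g. atlas: 8931 days = 24.5 years, alice: 7975 days = 21.8 years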

Service planning for WLCG: ATLAS + ALICE
[Table: proposed plan for WLCG Grid resources in the Czech Republic, with rows CPU (kSI2000), Disk (TB), Nominal WAN (Gb/s); based on the WLCG updated TDR for ATLAS and ALICE and our anticipated share in the LHC experiments]
- no financing for LHC resources in 2007 (current farm capacity share used for LHC)
- we submit project proposals to various grant systems in the Czech Republic
- a bigger project proposal for a CZ GRID is being prepared together with CESNET
  - for the LHC needs
  - in 2010, add 3x more capacity for Czech non-HEP scientists, financed from state resources and EU structural funds
- all proposals include new personnel (up to 10 new persons)
- no resources for LHC computing, yet
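The table values themselves did not survive the transcript, but the underlying calculation is straightforward: the experiments' TDR requirements scaled by the anticipated Czech share. A hedged sketch; every number below is an illustrative placeholder, not a value from the original table.

# Sketch of a Tier-2 pledge estimate: experiment-wide requirements from the
# computing TDRs scaled by the anticipated national share. All numbers here
# are illustrative placeholders, not the figures from the original slide.
tdr_requirements = {
    # experiment: (CPU in kSI2000, disk in TB) requested from all Tier-2s
    "ATLAS": (20_000, 9_000),
    "ALICE": (14_000, 5_000),
}
anticipated_share = {"ATLAS": 0.02, "ALICE": 0.02}  # e.g. a ~2 % national share

cpu_pledge = sum(cpu * anticipated_share[exp] for exp, (cpu, _) in tdr_requirements.items())
disk_pledge = sum(disk * anticipated_share[exp] for exp, (_, disk) in tdr_requirements.items())

print(f"pledged CPU:  ~{cpu_pledge:.0f} kSI2000")
print(f"pledged disk: ~{disk_pledge:.0f} TB")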

Conclusion
GRID for HEP in the Czech Republic
- has good grounds to be built up:
  - excellent networking
  - existing infrastructure of specialists contributing to international GRID projects
  - a pilot installation exists, is fully integrated into WLCG/EGEE and can be enlarged
- project proposals for HEP GRID are submitted regularly; a national GRID proposal (with CESNET) has been prepared
- no adequate financial resources for LHC computing, yet