1 Current status and plans for Czech Grid for HEP
Milos Lokajicek, Prague
10 October 2006, ICFA DDW'06, Cracow

2 Outline
– Czech HEP Community and Contributions to Experiments
– Scientific Network and Computing Grid in the Czech Republic
– Plans
– Conclusion

3 Czech HEP Community
High Energy and Nuclear Physics community – 156 people
– 59 physicists
– 22 engineers
– 21 technicians
– 54 undergraduate and Ph.D. students
Main HEP collaborating institutions
– Academy of Sciences of the Czech Republic
– Charles University in Prague
– Czech Technical University in Prague
– Masaryk University in Brno
– Brno University of Technology
– Technical University of Liberec
– Silesian University in Opava
– …
(Map: main HEP institutions, spanning approx. 500 km)

4 Contributions to HEP and NP Experiments
Past experiments
– DELPHI, NA4, OMEGA, UA4-2, NA45, NA57, PIXEL (RD-19), GAS Microstrips (RD-28), TILECAL (RD-34), ROSE (RD-48), RADTOL (RD-49)
Current experiments
– ATLAS, ALICE, TOTEM, AUGER
– D0, H1, STAR
– COMPASS (NA58), DIRAC (PS212), ISOLDE/IS381, RADHARD (RD50), nTOF, NEG, MEDIPIX
– HADES and CBM (GSI Darmstadt), Cyclotron U-120M at NPI (max 40 MeV protons)
– ENLIGHT (hadron therapy)
Detector development
– Silicon pixels and strips, calorimeters, crystals, electronics
Computing
– EDG, EGEE, EGEE II, WLCG
Theory

5 Czech Scientific Computer Network
CESNET
– Czech National Research and Education Network (NREN) provider
– Association of legal entities formed by all universities of the Czech Republic and the Academy of Sciences of the Czech Republic
– Operation and development of the Czech NREN (CESNET2 network)
– Research and development of advanced network technologies and applications

6 CESNET2 topology

7 GEANT2 planned topology, May 2006

8 Optical network infrastructure provided by CESNET for HEP
1 Gbps lines to:
– FNAL
– Taipei
– FZK (very new)
Optical 1 Gbps connections with collaborating institutions in Prague
TIER2 farm GOLIAS

9 GRID Computing
Pre-GRID experience from the 1990s
– METACentrum project of CESNET
  - Interconnection of supercomputers at Czech universities
  - AFS, single login and common job submission, user support, application SW
  - 400 Linux processors + a few special "supercomputers"
  - GRID-like project
No real national GRID project
Participation in international GRID projects via the CESNET METACentrum project
– EDG (from 2001), EGEE, EGEE II
– GRIDLAB, CoreGRID
– WLCG (from 2002)

10 High Energy Physics WLCG GRID
WLCG Tier-2 centre, two separate farms
– CESNET_EGEE – farm SKURUT, located at CESNET
  - 30 processors, storage on GOLIAS, used for ATLAS
  - Part of the CE ROC, support of VOCE
– Prague – farm GOLIAS, located at FZU (Institute of Physics AS CR)

11 Prague T2 centre
Prague – GOLIAS, from 2004
– Heterogeneous hardware due to irregular HW upgrades (since 2002)
– Total of 121 nodes (including blade servers), 250 cores
– ~150k SpecInt2000 (see the arithmetic sketch below)
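For orientation, a minimal arithmetic sketch in Python deriving the average per-core and per-node capacity implied by the figures quoted above; both inputs come from the slide, and the averages are only an illustration of the heterogeneous mix, not a benchmark result.

```python
# Rough consistency check of the quoted GOLIAS capacity figures.
# Both inputs come from the slide; the per-core/per-node averages are
# derived here purely as an illustration.

TOTAL_SPECINT2000 = 150_000   # ~150k SpecInt2000 for the whole farm
TOTAL_CORES = 250             # 250 cores in 121 heterogeneous nodes
TOTAL_NODES = 121

per_core = TOTAL_SPECINT2000 / TOTAL_CORES
per_node = TOTAL_SPECINT2000 / TOTAL_NODES

print(f"average per core: {per_core:.0f} SI2k")   # ~600 SI2k
print(f"average per node: {per_node:.0f} SI2k")   # ~1240 SI2k
```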

12 Prague T2 centre
– 41 TB of raw disk space; 3 disk arrays (see the RAID capacity sketch below)
  - 1 TB array (RAID 5)
  - 10 TB array (RAID 5)
  - 30 TB array (RAID 6)
– Tasks: 150/100 D0/LHC
– Network connections: 1 Gbps lines
  - GEANT, standard infrastructure
  - FNAL, p2p, CzechLight/GLIF
  - Taipei, p2p, CzechLight/GLIF
  - FZK Karlsruhe, p2p over GN2, from 1 September 2006
– 150 kW total electric power for computing, UPS, diesel
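Since the slide quotes raw disk space and mixes RAID 5 and RAID 6, a short sketch of the standard usable-capacity formulas may help; the disk counts and sizes used below are hypothetical, as the actual array configurations are not given on the slide.

```python
# Usable capacity of the disk arrays: RAID 5 sacrifices one disk to parity,
# RAID 6 sacrifices two. The disk counts and sizes below are hypothetical;
# the slide only quotes the raw array sizes (1 TB, 10 TB, 30 TB).

def usable_tb(num_disks: int, disk_tb: float, raid_level: int) -> float:
    """Usable capacity in TB of a RAID 5 or RAID 6 array."""
    parity_disks = {5: 1, 6: 2}[raid_level]
    return (num_disks - parity_disks) * disk_tb

hypothetical_arrays = [
    ("1 TB array",  8,  0.146, 5),   # e.g. 8 x 146 GB disks, RAID 5
    ("10 TB array", 25, 0.4,   5),   # e.g. 25 x 400 GB disks, RAID 5
    ("30 TB array", 60, 0.5,   6),   # e.g. 60 x 500 GB disks, RAID 6
]

for name, n, size, level in hypothetical_arrays:
    print(f"{name}: raw {n * size:.1f} TB, "
          f"usable {usable_tb(n, size, level):.1f} TB (RAID {level})")
```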

13 Personnel
4 persons run the WLCG Tier-2
– Jiri Kosina – middleware, storage (FTS), monitoring
– Tomas Kouba – middleware, monitoring
– Jan Svec – basic HW, OS, storage, networking, monitoring
– Lukas Fiala – basic HW, networking, web services
– Jiri Chudoba – liaison to ATLAS and ALICE, running the jobs and reporting errors, service monitoring
(more personnel on EGEE II)

14 Supported VOs: ATLAS, ALICE, D0, STAR, AUGER, HONE
Contribution to D0, ATLAS-LCG and ALICE at the level of 5-9% in 2005
On the graph: an ALICE VOBOX problem stopped ALICE job submission (pink colour)

15 Service Challenge 2006 – FTS tests FZK-FZU
– FZK – FZU, standard GEANT internet connection: transfer of 100 files, 1 GB each
– FZU – SARA, standard GEANT internet connection: transfer of 50 GB
(Graphs by J. Chudoba; annotated rate 48 MB/s. A rough throughput check is sketched below.)
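A rough throughput check, assuming the annotated 48 MB/s is an average rate over a nominal 1 Gbps path (neither assumption is stated explicitly on the slide); it compares the observed rate with the theoretical line ceiling and estimates the duration of the 100 x 1 GB test.

```python
# Back-of-the-envelope check of the FZK -> FZU FTS test.
# Assumption: 48 MB/s is an average rate over a nominal 1 Gbps GEANT path.

LINK_GBPS = 1.0
link_ceiling_mbs = LINK_GBPS * 1000 / 8    # ~125 MB/s theoretical maximum

observed_mbs = 48.0                        # rate annotated on the graph
files, file_gb = 100, 1.0                  # 100 files of 1 GB each

utilisation = observed_mbs / link_ceiling_mbs
duration_min = files * file_gb * 1000 / observed_mbs / 60

print(f"link ceiling : {link_ceiling_mbs:.0f} MB/s")
print(f"utilisation  : {utilisation:.0%}")                       # ~38 %
print(f"est. duration: {duration_min:.0f} min for 100 x 1 GB")   # ~35 min
```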

16 Service planning
(Table: proposed plan for WLCG Grid resources in the Czech Republic – ATLAS + ALICE; columns CPU (MSI2000), Disk (TB), MSS (TB))
Table based on the WLCG TDR for ATLAS and ALICE and our anticipated share in the LHC experiments (see the scaling sketch below)
– We submit project proposals to various grant systems in the Czech Republic
– Preparing a bigger project proposal for CZ GRID together with CESNET
  - For the LHC needs
  - In 2010, add 3x more capacity for Czech non-HEP scientists, financed from state resources and EU structural funds
– All proposals include new personnel (up to 10 new persons)
– No resources for LHC computing yet
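A minimal sketch of the scaling behind such a planning table, assuming a flat anticipated share and placeholder TDR totals; the real WLCG TDR figures and the exact Czech share are not reproduced here.

```python
# How a national plan can be scaled from experiment-wide requirements:
#   planned_resource = TDR_requirement * anticipated_share
# The share and the per-year TDR totals below are placeholders, NOT the
# real ATLAS/ALICE figures from the WLCG TDR.

ANTICIPATED_SHARE = 0.05   # assumed ~5% (slide 14 quotes a 5-9% level in 2005)

tdr_requirements = {
    # year: (CPU in MSI2000, disk in TB, MSS in TB) -- placeholder values
    2007: (10.0, 2000, 1000),
    2008: (20.0, 4000, 2000),
}

for year, (cpu_msi2k, disk_tb, mss_tb) in sorted(tdr_requirements.items()):
    print(f"{year}: CPU {cpu_msi2k * ANTICIPATED_SHARE:.2f} MSI2000, "
          f"disk {disk_tb * ANTICIPATED_SHARE:.0f} TB, "
          f"MSS {mss_tb * ANTICIPATED_SHARE:.0f} TB")
```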

17 Conclusion
GRID for HEP in the Czech Republic
– Has good grounds to be built up
  - Excellent networking
  - Existing infrastructure of specialists contributing to international GRID projects
  - A pilot installation exists, is fully integrated into WLCG/EGEE, and can be enlarged
– Project proposals for HEP GRID are regularly submitted
  - A national GRID proposal (with CESNET) has been prepared
– No adequate financial resources for the GRID infrastructure yet