Tier2 Centre in Prague
Jiří Chudoba
FZU AV ČR - Institute of Physics of the Academy of Sciences of the Czech Republic

Outline
– Supported groups
– Hardware
– Middleware and software
– Current status

Particle Physics in the Czech Republic
Groups located at
– Charles University in Prague
– Czech Technical University in Prague
– Institute of Physics of the Academy of Sciences of the Czech Republic
– Nuclear Physics Institute of the Academy of Sciences of the Czech Republic
Main applications
– Projects ATLAS, ALICE, D0, STAR, TOTEM
– Groups of theoreticians
Approximate size of the community: 60 scientists, 20 engineers, 20 technicians and 40 students and PhD students

Hardware
2 independent computing farms
– Golias, located at FZU
– Skurut, from CESNET (the Czech academic network provider)
  – older hardware (29 dual nodes, PIII 700 MHz) offered by CESNET
  – part used as a production farm, some nodes for tests and support for different VOs (VOCE, GILDA, CE testbed)
  – contributed to ATLAS DC2 at a level of 2% of all jobs finished on LCG

Available Resources - FZU
Computer hall
– 2 x 9 racks
– 2 air-conditioning units
– 180 kW electrical power available from UPS, backed up by a Diesel generator
– 1 Gbps optical connection to the CESNET metropolitan network
– direct 1 Gbps optical connection to CzechLight
– shared with other FZU activities

FZU Worker Nodes
Worker nodes (September 2005)
– 67x HP DL140, dual Intel Xeon 3.06 GHz with HT (enabled only on some nodes), 2 or 4 GB RAM, 80 GB HDD
– 1x dual AMD Opteron 1.6 GHz, 2 GB RAM, 40 GB HDD
– 24x HP LP1000r, 2x PIII 1.13 GHz, 1 GB RAM, 18 GB SCSI HDD
– WNs connected via 1 Gbps (DL140) or 100 Mbps (LP1000r)
Network components
– 3x HP ProCurve Networking Switch 2848 (3x 48 ports)
– HP 4108GL
~30 kSI2K will be added this year

Golias Farm Hardware - Servers
– PBS server: HP DL360, 2x Intel Xeon 2.8 GHz, 2 GB RAM
– CE, SE, UI: HP LP1000, 2x PIII 1.13 GHz, 1 GB RAM, 100 Mbps (the SE should be upgraded to 1 Gbps soon)
NFS servers (a minimal export sketch follows below)
– 1x HP DL145, 2x AMD Opteron 1.6 GHz, connected to a 30 TB (raw capacity) disc array, ATA discs
– 1x HP LP4100TC, 1 TB disc array, SCSI discs
– 1x embedded server in EasySTOR 1400RP (PIII), 10 TB, ATA discs
dCache server
– HP DL140 upgraded with a RAID controller, 2x 300 GB discs
– not used for production, reserved for SC3
Some other servers (www, SAM)
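For context, a minimal sketch of how such an NFS server might export its disc array to the farm; the path and the network range below are hypothetical, not the actual Golias configuration.

    # /etc/exports sketch -- the path and network range are illustrative
    # assumptions, not the actual site values.
    # Export the disc array read-write to the farm's private network;
    # no_root_squash lets root on the worker nodes manage the shared area.
    /data/array0  192.168.1.0/255.255.255.0(rw,sync,no_root_squash)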

Middleware, Batch System
GOLIAS:
– LCG2 (2_6_0)
  – CE, SE, UI: SLC3
  – WNs: RH7.3 (the local D0 group is not yet ready for SLC3)
– PBSPro server not on the CE
  – the CE submits jobs to the node golias (the PBSPro server)
  – local users can submit to local queues on golias (see the job script sketch below)
SKURUT:
– LCG2 (2_6_0), OpenPBS server on the CE
– all nodes: SLC3
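To illustrate the local submission path, a minimal PBS job script of the kind a local user might qsub to one of the experiment queues on golias; the job name, resource requests and executable are hypothetical, only the queue name d0 is taken from the queue list on the next slide.

    #!/bin/bash
    # Minimal PBS job script (sketch) -- submit on golias with: qsub job.sh
    # The resource requests and the executable are illustrative assumptions.
    #PBS -N d0-analysis           # job name
    #PBS -q d0                    # one of the local experiment queues
    #PBS -l nodes=1:ppn=1         # one CPU slot on one worker node
    #PBS -l walltime=12:00:00     # maximum run time
    #PBS -j oe                    # merge stdout and stderr into one file

    cd "$PBS_O_WORKDIR"           # start where qsub was invoked
    ./run_analysis                # hypothetical user executable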

Queues
Separate queues for different experiments and privileged users:
– atlas, atlasprod, lcgatlas, lcgatlasprod, alice, aliceprod, d0, d0prod, auger, star, ...
– short, long
Priorities are set by some PBS parameters (a qmgr sketch follows below):
– max_running, max_user_running, priority, node properties
– still not optimal in a heterogeneous environment
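As a sketch of how the parameters above are set, the qmgr commands below (OpenPBS/PBSPro style) configure one production queue; the numeric values and the node property are assumptions for illustration, not the values used on the farms.

    # qmgr sketch for one production queue -- all values are illustrative
    # assumptions, not the actual Golias/Skurut settings.
    qmgr -c "create queue atlasprod queue_type = execution"
    qmgr -c "set queue atlasprod enabled = true"
    qmgr -c "set queue atlasprod started = true"
    qmgr -c "set queue atlasprod max_running = 60"    # cap on concurrently running jobs
    qmgr -c "set queue atlasprod max_user_run = 40"   # per-user cap (the slide's max_user_running)
    qmgr -c "set queue atlasprod priority = 80"       # higher-priority queues are scheduled first
    # Node properties tie a queue to suitable hardware, e.g. only nodes
    # tagged "dl140" in the PBS nodes file (OpenPBS-style neednodes):
    qmgr -c "set queue atlasprod resources_default.neednodes = dl140"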

Job Statistics 2005/1-6
[Figure: used CPU time (in days) per activity, January – June 2005]

Simulations for ATLAS

ALICE PDC – Phase 2 (2004)
– 2004: ALICE jobs submitted to Golias via AliEn
– 2005: a new version of AliEn is being installed

Tier1 – Tier2 Relations
– Requirements defined by the ATLAS and ALICE experiments
– A direct network connection between FZU and GridKa will be provided by GEANT2 next year
– "Know-how" exchange is welcome

THE END