Operations and plans - Polish sites


Operations and plans - Polish sites
Janusz Oleniacz, Warsaw University of Technology, Faculty of Physics

Outline:
- ALICE sites in Poland
- ALICE sites in Poland – operations
- WUT site: operations, technical details, plans
- Summary

3 ALICE sites in Poland: CYFRONET, POZNAN, WUT
- CYFRONET is the Academic Computer Centre CYFRONET at the AGH University of Science and Technology in Kraków
- POZNAN is the Poznan Supercomputing and Networking Center at the Institute of Bioorganic Chemistry of the Polish Academy of Sciences
- WUT is the Warsaw University of Technology

Polish ALICE sites – operations: CYFRONET
- ZEUS cluster, also used by ATLAS and LHCb
- increased its load from 300 to 800 jobs over the last year
- present in PIONIER and LHCONE
- keeps an SE of 28 TB storage, 80% used

Polish ALICE sites – operations: POZNAN
- REEF cluster, also used by ATLAS and LHCb
- increased its load from 300 to 1000 jobs over the last year
- present in GEANT, peering with LHCONE (CERNlight, RU-VRF)
- keeps an SE of 187 TB storage, 53% used

Polish ALICE sites – operations: WUT
- increased its load from 20-30 to 300 jobs over the last week
- SE of 12 TB on XRootD

WUT site – operations
- now 300 jobs ran over the last week, from 450 CPU cores
- run in co-operation between the Faculty of Physics and the Faculty of Civil Engineering
- new hardware bought after consultation with the main CERN/ALICE specialists (Latchezar et al. - thanks!)

WUT site – technical details
Co-operation between two faculties:
- Faculty of Physics: now runs CREAM CE, SE, BDII and APEL on VMs under VMware
- Faculty of Civil Engineering: now runs the Torque server and 50 WNs (hyperthreaded, 8 VCPUs, 16 GB RAM, 100 GB disk) on OpenStack (Mitaka, RackSpace); see the sketch below
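As an illustration of that setup, here is a minimal sketch, assuming the openstacksdk Python client, of booting such a batch of worker-node VMs. This is not the site's actual tooling; the cloud profile "wut-openstack", image "SL6-WN", flavor "wn.8vcpu.16gb" and network "wn-net" are hypothetical names.

```python
import openstack

# Connect using a cloud profile from clouds.yaml (profile name is an assumption).
conn = openstack.connect(cloud="wut-openstack")

image = conn.image.find_image("SL6-WN")             # hypothetical WN image
flavor = conn.compute.find_flavor("wn.8vcpu.16gb")  # 8 VCPUs, 16 GB RAM, 100 GB disk
network = conn.network.find_network("wn-net")       # hypothetical tenant network

# Boot the 50 worker nodes; in practice each would register with the Torque
# server on first boot (e.g. via cloud-init user data, omitted here).
for i in range(50):
    conn.compute.create_server(
        name=f"wut-wn-{i:02d}",
        image_id=image.id,
        flavor_id=flavor.id,
        networks=[{"uuid": network.id}],
    )
```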

WUT site – technical details
At the Faculty of Civil Engineering – OpenStack with:
- 60 (Nova) nodes, each with:
  - 2 x Intel Xeon CPU E5-2680 v3 @ 2.50 GHz (12 cores, 24 logical with HT)
  - 96 GB RAM
  - 480 GB SSD disk
  - 10 Gbit/s Ethernet
- together: 120 CPUs, 1440 cores, 2880 VCPUs (checked in the snippet below)
- storage on Ceph with 18 nodes, each with:
  - Intel Xeon CPU E5-2630 v3 @ 2.40 GHz
  - 64 GB RAM
  - 2 x 300 GB for system, 2 x 128 GB SSD for journal, 8 x 4 TB for raw data
- 10 Gbit/s to the Nova nodes and 56 Gbit/s InfiniBand to storage, with 576 TB raw
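The aggregate figures quoted on the slide follow directly from the per-node specs; a quick arithmetic check:

```python
# Aggregate capacity of the 60 Nova nodes and the 18-node Ceph cluster.
nova_nodes = 60
sockets_per_node = 2            # 2 x Xeon E5-2680 v3 per node
cores_per_socket = 12

cpus = nova_nodes * sockets_per_node        # 120 CPUs
cores = cpus * cores_per_socket             # 1440 physical cores
vcpus = cores * 2                           # 2880 VCPUs with hyperthreading

ceph_nodes = 18
data_disks_per_node = 8                     # 8 x 4 TB raw-data disks per node
raw_tb = ceph_nodes * data_disks_per_node * 4   # 576 TB raw Ceph capacity

print(cpus, cores, vcpus, raw_tb)           # 120 1440 2880 576
```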

WUT site – network bottleneck
- Faculty of Physics and Faculty of Civil Engineering are currently connected by a 1 Gbps VLAN
- outside connectivity is quite low (ingress below 75 Mbps, egress 800 Mbps)

WUT site – plans (short term)
In co-operation, the two faculties will apply for funding this year, with the goal of doubling current IT capacities.
- Faculty of Physics: move the SE to OpenStack (EOS)
- Faculty of Civil Engineering (OpenStack):
  - more than 50 WNs, with HEAT orchestration for opportunistic use of free CPUs (max 2880 VCPUs with 2 GB RAM/VCPU); see the sketch below
  - install an EOS SE with ca. 100 TB on a Ceph device type
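A minimal sketch of how the planned HEAT orchestration could look, again assuming openstacksdk; the stack name, flavor and image are the same hypothetical placeholders as above, and the real deployment would add networks, cloud-init and a scaling policy driven by free-VCPU availability.

```python
import openstack

conn = openstack.connect(cloud="wut-openstack")   # assumed cloud profile

# Tiny HOT template: one parameterized group of opportunistic worker nodes.
template = {
    "heat_template_version": "2016-04-08",
    "parameters": {
        "wn_count": {"type": "number", "default": 50},
    },
    "resources": {
        "wn_group": {
            "type": "OS::Heat::ResourceGroup",
            "properties": {
                "count": {"get_param": "wn_count"},
                "resource_def": {
                    "type": "OS::Nova::Server",
                    "properties": {
                        "flavor": "wn.8vcpu.16gb",   # assumed flavor name
                        "image": "SL6-WN",           # assumed image name
                    },
                },
            },
        },
    },
}

# Create the stack; updating wn_count later grows or shrinks the WN pool
# opportunistically as free VCPUs appear.
conn.orchestration.create_stack(name="alice-opportunistic-wn", template=template)
```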

WUT site – plans (long term)
In co-operation between the two faculties:
- apply for IPv6 (a local pool of IPv6 addresses is waiting for local division and implementation)
- apply for LHCONE connectivity (there is a country-wide 100G network, and the ICM centre in Warsaw, with the HYDRA cluster for CMS and LHCb, was recently connected to LHCONE)
Faculty of Physics:
- upgrade „WUT” to CentOS 7 / UMD4 / ARC / Slurm / VAC?
- tuning of network 10G / firewall / IPv6 issues (a small IPv6 reachability check is sketched below)
Faculty of Civil Engineering (OpenStack):
- upgrade of OpenStack itself
- increase computing and storage resources with new funds
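For the IPv6 work, a small generic diagnostic of the kind a site might run (an assumption, not the site's tooling): check that a host resolves to an AAAA record and that a TCP connection over IPv6 actually succeeds.

```python
import socket

def ipv6_reachable(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Return True if host has an IPv6 address we can open a TCP socket to."""
    try:
        infos = socket.getaddrinfo(host, port, socket.AF_INET6, socket.SOCK_STREAM)
    except socket.gaierror:
        return False                      # no AAAA record / no IPv6 resolution
    for family, socktype, proto, _, sockaddr in infos:
        try:
            with socket.socket(family, socktype, proto) as s:
                s.settimeout(timeout)
                s.connect(sockaddr)       # firewall/routing problems show up here
                return True
        except OSError:
            continue
    return False

print(ipv6_reachable("alimonitor.cern.ch"))   # example target, assumed dual-stack
```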

Summary for ALICE sites in Poland:
- CYFRONET: more cores, stable, no news
- POZNAN
- WUT:
  - new „cloud”-type setup on flexible OpenStack
  - EOS SE under way, with ca. 100 TB
  - plans to improve networking (IPv6, PIONIER/LHCONE, local 10G)
  - plans to gain financial support in co-operation between the local WUT faculties

Thanks!
Prepared by Janusz Oleniacz (Janusz.Oleniacz@cern.ch, oleniacz@if.pw.edu.pl)
Seventh Annual ALICE Tier-1/Tier-2 Workshop in Strasbourg, https://indico.cern.ch/event/595536