Status report on LHC_2: ATLAS computing

Presentation transcript:

Status report on LHC_2: ATLAS computing
Tetsuro Mashimo, International Center for Elementary Particle Physics (ICEPP), The University of Tokyo, on behalf of the LHC_2 project team
Workshop FJPPL’07, May 9, 2007 @KEK, Japan

LHC_2 in the year 2006
Collaboration between the IN2P3 Computing Center in Lyon (CC-IN2P3) and the ‘Regional Center’ at ICEPP, the University of Tokyo.
Purpose: various R&D studies for WLCG (Worldwide LHC Computing Grid).
In WLCG, CC-IN2P3 serves as a ‘Tier-1’ center and ICEPP as a ‘Tier-2’ center; the Tier-2 center at ICEPP is for the ATLAS experiment only.
Especially important: establishing a high-bandwidth network connection and efficient data transfer in order to produce physics results quickly.
It is challenging to fully exploit the available bandwidth because of the large latency of such a long-distance connection (Round Trip Time (RTT) ~ 280 ms).
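
To make the latency problem concrete, the following back-of-the-envelope calculation (not part of the original slides) gives the bandwidth-delay product implied by the quoted figures; it shows roughly how much data a single TCP stream must keep in flight to fill the link.

```python
# Bandwidth-delay product for the Lyon-Tokyo link, using the figures
# quoted on the slide (1 Gbps available bandwidth, RTT ~ 280 ms).
bandwidth_bps = 1e9   # available bandwidth in bits per second
rtt_s = 0.280         # round trip time in seconds

bdp_bytes = bandwidth_bps * rtt_s / 8
print(f"bandwidth-delay product: {bdp_bytes / 1e6:.0f} MB")
# ~35 MB must be in flight at any moment to keep one TCP stream at full
# rate, far above the default TCP window/buffer sizes of that era.
```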

ATLAS Detector Construction & Installation
A few PB of raw data each year.
Diameter 25 m; barrel toroid length 26 m; end-cap end-wall chamber span 46 m; overall weight 7,000 tons; detector sensors: 110M channels.

Distributed computing for LHC

[Figure from R. Jones / ATLAS.]

LCG-France sites and the LHC experiments they support.
All sites also support other virtual organizations.

LCG-France sites
Tier-1 and Analysis Facility (AF): CC-IN2P3 (Lyon).
Tier-2: GRIF (CEA/DAPNIA, LAL, LLR, LPNHE, IPNO) in Ile de France; Subatech (Nantes); LPC (Clermont-Ferrand).
Tier-3: IPHC (Strasbourg); IPNL (Lyon); LAPP (Annecy); CPPM (Marseille).

ATLAS FR Cloud
Tier-1 and Analysis Facility (AF): CC-IN2P3 (Lyon).
Tier-2: GRIF (CEA/DAPNIA, LAL, LPNHE) in Ile de France; LPC (Clermont-Ferrand).
Tier-3: LAPP (Annecy); CPPM (Marseille).
Associated foreign sites: Beijing, Tokyo, Romania.

LHC_2 in the year 2006 (cont’d)
Members of the project team (* leader):
French group: D. Boutigny* (IN2P3), F. Malek, G. Rahal, F. Hernandez
Japanese group: T. Mashimo* (ICEPP), I. Ueda, H. Matsumoto, H. Matsunaga

Budget plan in the year 2006
Unit costs: travel 1,000 EUR or 160 kYen per trip; per-diem 237 EUR or 22.7 kYen per day.
CNRS:  2 travels = 2,000 EUR; 10 days per-diem = 2,370 EUR
ICEPP: 2 travels =   320 kYen; 10 days per-diem =  227 kYen
Total: 4,370 EUR / 547 kYen

Activities in 2006
Mainly tests of data transfer, within the overall ATLAS framework: ‘SC4’ (Service Challenge 4), plus additional special tests.
Communications mainly by e-mail (visits in February and March 2007).

SC4 (Lyon → Tokyo)
RTT (Round Trip Time) ~ 280 msec; available bandwidth limited to 1 Gbps.
Linux kernel 2.4, no tuning, standard LCG middleware (GridFTP): ~ 20 MB/s (15 files in parallel, 10 streams each).
Not satisfactory (packet loss).
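
As a rough illustration of why the untuned kernel-2.4 transfers saturated around 20 MB/s, the sketch below applies the well-known Mathis approximation for loss-limited TCP throughput. The packet-loss rate used here is an assumption chosen purely for illustration, not a value measured in the report.

```python
import math

# Rough upper bound on per-stream TCP throughput over a lossy path
# (Mathis et al. approximation): rate <= (MSS / RTT) * (C / sqrt(p)), C ~ 1.22.
mss_bytes = 1460      # typical Ethernet MSS
rtt_s = 0.280         # Lyon-Tokyo round trip time
loss_rate = 1e-3      # assumed packet-loss rate (illustrative only)

per_stream = (mss_bytes / rtt_s) * (1.22 / math.sqrt(loss_rate))  # bytes/s
streams = 15 * 10     # 15 parallel files x 10 TCP streams each, as in the SC4 test
print(f"per-stream bound : {per_stream / 1e6:.2f} MB/s")
print(f"aggregate bound  : {streams * per_stream / 1e6:.1f} MB/s")
```

Under these assumptions each stream is capped at a fraction of a MB/s, so even 150 concurrent streams remain far below the 1 Gbps link capacity, consistent with the modest aggregate rate observed.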

Test with iperf (memory to memory)
Linux kernel 2.6.17.7; congestion control: TCP Reno vs. BIC TCP; PSPacer 2.0.1 (from AIST, Tsukuba) also tried.
Best result: BIC TCP + PSPacer, Tokyo → Lyon: > 800 Mbps with 2 streams.
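
For readers who want to reproduce this kind of Reno vs. BIC comparison, the minimal sketch below shows how the congestion-control algorithm can be selected per socket on Linux; the function name, host, port, and buffer size are illustrative assumptions and not part of the original tests, which used iperf directly.

```python
import socket

def open_test_connection(host: str, port: int, algorithm: str = "bic") -> socket.socket:
    """Open a TCP connection using a chosen congestion-control algorithm (Linux only)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # The algorithm must be available in the running kernel; see
    # /proc/sys/net/ipv4/tcp_available_congestion_control (e.g. "reno", "bic").
    s.setsockopt(socket.IPPROTO_TCP, socket.TCP_CONGESTION, algorithm.encode())
    # Request large send buffers to cover the ~35 MB bandwidth-delay product;
    # the kernel may clamp this to its configured maximum (net.core.wmem_max).
    s.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 32 * 1024 * 1024)
    s.connect((host, port))
    return s
```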

Summary of iperf results
SL(C)4 (kernel 2.6 with BIC TCP) gives much better congestion control than SL3 (kernel 2.4); the software pacer (PSPacer by AIST) additionally gives stable, good performance.

SL3 (kernel 2.4):
                1 stream      10 streams
Lyon → Tokyo    0-5 MB/s      2-20 MB/s
Tokyo → Lyon    10-15 MB/s    44-60 MB/s

SL(C)4 (kernel 2.6, BIC TCP + PSPacer):
                1 stream      2 to 8 streams
Lyon → Tokyo    45 MB/s
Tokyo → Lyon    70 MB/s       100 MB/s

LHC_2 in the year 2007 (cont’d)
Members increased relative to 2006 (* leader):
French group: E. Lançon* (CEA), F. Malek (IN2P3), G. Rahal, F. Hernandez, D. Boutigny, J. Schwindling, S. Jézéquel
Japanese group: T. Mashimo* (ICEPP), I. Ueda, H. Matsunaga, J. Tanaka, T. Kawamoto

Year 2007
Not only purely technical R&D, but also studies of data movement from the physicists’ point of view.
The available network bandwidth will increase (probably this year).
A new computer system has been installed at ICEPP, and more manpower will soon be available for technical studies.
More intensive R&D this year toward the LHC start-up.

[Photos of the new ICEPP system: tape robot, disk arrays, PC servers.]

Budget plan in the year 2007
Unit costs: travel 1,000 EUR or 160 kYen per trip; per-diem 237 EUR or 22.7 kYen per day.
CNRS:  3 travels = 3,000 EUR; 15 days per-diem = 3,555 EUR
CEA:   1 travel  = 1,000 EUR;  5 days per-diem = 1,185 EUR
ICEPP: 3 travels =   480 kYen; 12 days per-diem =  272 kYen
Total: 8,740 EUR / 752 kYen