KEK CC - present and future - Mitsuaki NOZAKI (KEK)

Presentation transcript:

Accelerator facilities:
- LHC at CERN
- KEKB B-Factory
- Photon Factory
- J-PARC in Tokai
- ILC test facilities: ATF, STF

[Roadmap diagram: KEK scientific activities]
- Quest for the birth and evolution of the Universe
- Quest for unifying matter and force [origin of force]
- Higgs particle [origin of mass]
- Quark CP asymmetry [origin of matter]; quest for the 6 quarks
- Quest for neutrinos; lepton CP asymmetry
- Physics beyond the Standard Model
- Facilities: KEK-B, LHC, J-PARC power upgrade, Super-KEKB, International Linear Collider (ILC)
- Pillars: scientific activities, technology innovation, encouraging human resources

From KEKB to SuperKEKB

KEK CC
- KEK staff: 15 researchers + 8 engineers
- Contractors: ~20 SEs (system engineers)
- Budget: ~20 oku-yen/year (1 oku-yen = 100 million yen)
  - Central computing system: 5 oku-yen
  - Network: 2 oku-yen
  - B-Factory: 5 oku-yen
  - Supercomputer: 7 oku-yen
  - Others: 1 oku-yen
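A trivial cross-check, as a short Python snippet, that the line items above add up to the stated ~20 oku-yen per year:

```python
# Budget line items (oku-yen/year); they sum to the stated ~20 total.
items = {"central computing": 5, "network": 2, "B factory": 5,
         "supercomputer": 7, "others": 1}
print(sum(items.values()))  # -> 20
```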

Central Computing System: KEKCC
- Rental from IBM (Feb. ~ Feb. 2012)
- Work servers / computing servers: program development, job submission, etc.
  - IBM System x3550 (Xeon-QX 5460 x2, 16 GB memory)
  - Computing servers: 84 nodes (SPECint_rate2006)
  - OS: RHEL5; batch system: Platform LSF HPC
- Software: CERNLIB, Geant4, CLHEP, ROOT; Monte Carlo simulation codes
- Storage system
  - HPSS (High Performance Storage System) tape library: 3 PB
  - Disk storage: 205 TB
- GRID system; HPSS data servers / file servers
- Summary: CPU 640 cores, disk ~200 TB, tape 3 PB
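To illustrate the software stack listed above, here is a minimal sketch of the kind of ROOT-based job a user might develop on the work servers and submit through Platform LSF. The input file, tree, and branch names are hypothetical, not taken from the slides:

```python
# Minimal PyROOT sketch of a user analysis job (hypothetical names throughout).
import ROOT

f = ROOT.TFile.Open("events.root")   # hypothetical input file
tree = f.Get("events")               # hypothetical TTree name

# Fill an energy histogram from a hypothetical "energy" branch.
h = ROOT.TH1F("h_energy", "Energy;E [GeV];entries", 100, 0.0, 10.0)
for entry in tree:
    h.Fill(entry.energy)

out = ROOT.TFile("hist.root", "RECREATE")
h.Write()
out.Close()
f.Close()
```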

B-Factory Computer System
- Rental from NetOne (Mar. ~ Feb. 2012)
- Computing servers
  - DELL PowerEdge M600 (Xeon-QX 5460 x2, 4 GB memory)
  - 480 nodes: 70,000 SPECint_rate2006
  - Linux (CentOS/CS)
- Disk storage system: 1.5 PB, 45 file servers
  - HSM: 910 TB; non-HSM: 630 TB
- Tape library system: SONY PetaServ
  - 3.5 PB, 68 drives (SAIT-1/2), 14 servers
  - SAIT: 500 GB/volume, 7000 volumes
  - 30 MB/s (SAIT-1), 45 MB/s (SAIT-2)
- Summary: CPU 4000 cores, disk 1.5 PB, tape 3.5 PB
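A quick consistency check of the quoted storage figures (a sketch; decimal units assumed):

```python
# Consistency check of the B-Factory storage figures (decimal units assumed).
hsm_tb, non_hsm_tb = 910, 630
print(hsm_tb + non_hsm_tb)            # 1540 TB, i.e. ~1.5 PB of disk
volumes, gb_per_volume = 7000, 500
print(volumes * gb_per_volume / 1e6)  # 3.5 PB of tape
```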

Supercomputer System
- Hitachi SR11000 K1: 16 nodes, 2.15 TFlops theoretical peak, 512 GB memory
- IBM Blue Gene Solution: 10 racks, 1024 nodes/rack, 57.3 TFlops theoretical peak
  - Massively parallel system for lattice QCD simulation
- Summary: SR11000: 2.15 TFlops, 512 GB memory; Blue Gene: 57.3 TFlops, 5 TB memory
- Job utilization: system A: 81%, system B: 98%
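The Blue Gene peak figure is consistent with the node count if one assumes the 5.6 GFlops theoretical peak of a Blue Gene/L node; the per-node figure is an assumption, not stated on the slide:

```python
# Blue Gene theoretical peak from node count (per-node peak is assumed).
racks, nodes_per_rack = 10, 1024
gflops_per_node = 5.6  # assumed Blue Gene/L per-node peak (not on the slide)
print(racks * nodes_per_rack * gflops_per_node / 1000.0)  # ~57.3 TFlops
```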

Upgrade Plan
[Timeline chart: upgrade schedules for the central computer (mail, web, …, and research services), the Belle system, and the supercomputer; the next supercomputer system is labeled with "national project", "tuning", and "operation" phases.]

Future prospects of KEK CC
- Tier-0 for Belle II (2014~?)
  - Tier-1 centers in Asia, Europe, and the Americas
  - What if Super-KEKB and Super-B merge?
- Tier-1 or Tier-2 for SLHC (20xx~)?
  - Needs discussion with ICEPP and ATLAS-J
- Continue R&D: GRID, Geant4, GRACE, DAQ, …

ICEPP ATLAS Tier-2
[Photo: ~270 m² machine room housing PC servers, disk arrays, and a tape robot.]

Global links
- Asian accelerator network: G5 (China, India, Korea, Japan, Russia)
  - Launched last December on the DG's initiative
- Link to Europe through FxPPL (x = c, j, k, v)
- Why not a "geographical enlargement" of FJPPL?
  - Start with the computing centers: CC-IN2P3, KEK, KISTI, GridKa
  - Then invite other countries

Merci. (Thank you.)