
Recent Status and Future Plans at KEK Computing Research Center
Nov. 28th, 2007, at the 3rd CC-IN2P3 - CRC-KEK Meeting
Setsuya Kawabata, KEK Computing Research Center

Outline
1. Main Projects at KEK
   1. Tsukuba campus
   2. J-PARC (Tokai campus)
2. Computer Facility at KEK
   1. Central Information System
   2. B Factory Computer System
   3. LCG Deployment Plans
   4. Super Computer System
3. Future Plans at KEK
   1. Research Projects
   2. Computing Facilities

1. Main Projects at KEK
[Map of Japan: Tsukuba campus (B-Factory, Photon-Factory, LC-Test Facility) and Tokai campus (J-PARC); Narita Airport shown for reference]

KEK (Tsukuba site)
[Aerial view: e+/e- Linac, KEKB ring, Photon Factory (2.5 GeV), Photon Factory-AR (6.5 GeV), Belle Experiment, Computing Research Center; Mt. Tsukuba in the background]

KEKB e+e- Collider and the Belle Experiment
- Asymmetric collider: 3.5 GeV e+ × 8 GeV e-
- ARES RF cavities (LER), superconducting RF cavities (HER); e+ source
- Belle collaboration: 13 countries, 57 institutes, ~400 collaborators
- Observation of CP violation in the B meson system (e.g. B0 → J/ψ KS)

B Factory: Belle Experiment
- Peak luminosity: 1.7×10^34 cm^-2 s^-1
- Accumulated luminosity: 710 fb^-1 as of the end of Dec. 2006
- ~1 fb^-1/day ~ 1 TB/day; 1 fb^-1 ~ 10^6 BBbar events
- Data accumulated so far: 1.5 PB ⇒ 2~3 PB in a few years
- In winter 2007 crab cavities were successfully installed; expected to more than double the luminosity
- Current luminosity is less than expected ⇒ an intensive study is going on
[Plot: integrated luminosity (1/fb) vs. year]
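The data-rate figures on this slide can be cross-checked with a short back-of-envelope script. The number of running days per year is an assumption added for illustration, not a figure from the talk.

```python
# Back-of-envelope projection of the Belle dataset from the slide's figures:
# ~1 fb^-1/day ~ 1 TB/day, and 1.5 PB accumulated so far.
TB_PER_FB = 1.0          # ~1 TB of data per fb^-1 (from the slide)
CURRENT_PB = 1.5         # data accumulated so far (from the slide)
RUN_DAYS_PER_YEAR = 250  # assumed running days per year (hypothetical)

def projected_pb(years, fb_per_day=1.0):
    """Total data volume in PB after `years` more years of running."""
    added_tb = years * RUN_DAYS_PER_YEAR * fb_per_day * TB_PER_FB
    return CURRENT_PB + added_tb / 1000.0  # 1000 TB = 1 PB

# Two to three more years at ~1 fb^-1/day indeed lands in the
# 2-3 PB range quoted on the slide.
print(projected_pb(2), projected_pb(3))
```

Under these assumptions two more years of running gives 2.0 PB and three years gives 2.25 PB, consistent with the "2~3 PB in a few years" estimate.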

1.2. J-PARC (Tokai)
J-PARC = Japan Proton Accelerator Research Complex, a joint project between KEK and JAEA
- Linac (330 m)
- 3 GeV Synchrotron (25 Hz, 1 MW)
- 50 GeV Synchrotron (0.75 MW)
- Materials and Life Science Experimental Facility
- Hadron Beam Facility
- Neutrino beam to Super-Kamiokande

J-PARC Neutrino Experimental Facility (T2K Experiment)
[Map: neutrino beam from J-PARC (Tokai) to Super-Kamiokande (Kamioka), 295 km to the west]

J-PARC (Materials & Life Science Facility)
Tentatively approved instruments:
- Super High Resolution Powder Diffractometer (SHRPD) - KEK
- High-intensity SANS (HI-SANS) - JAEA
- Neutron Reflectometer with Horizontal-Sample Geometry - KEK
- High-Resolution Chopper Spectrometer (HRC) - KEK
- Engineering Diffractometer - JAEA
- Neutron Resonance Spin Echo Spectrometers - KUR, Kyoto University
- Cold Neutron Double Chopper Spectrometer (CNDCS) - JAEA
- Versatile Powder Diffractometer - JAEA
- Diffractometer for Biological X'tallography (BIX-PN) - JAEA
- IBARAKI Biological Crystal Diffractometer - Ibaraki Prefecture
- IBARAKI Materials Design Diffractometer - Ibaraki Prefecture
- 4d Space Access Neutron Spectrometer (4SEASONS) - Grant-in-Aid for Specially Promoted Research, MEXT
- High-intensity Versatile Neutron Total Diffractometer - KEK
- Protein Dynamics Analysis Instrument (DIANA) - JAEA

1.3. J-PARC Information System
- J-PARC LAN (JLAN) in operation since Oct.
- Operated and maintained jointly by the computing centers of KEK and JAEA
- A LAN independent of the KEK and JAEA LANs
- A security policy independent of those of KEK and JAEA
[Diagram: Internet connecting the KEK LAN (Tsukuba campus), the JAEA LAN and the J-PARC LAN (Tokai campus), each behind its own firewall and security policy]

2. Computer Facility at KEK

2.1. Central Information System
Rental from IBM (Feb. 2006 ~ Jan. 2009)
Central Computing System: KEKcc
- Work servers / computing servers: program development, job submission, etc.
  - IBM eServer 326 (dual AMD Opteron); computing servers: 76 nodes
  - Red Hat Enterprise Linux; Platform LSF HPC batch system
  - Software: CERNLIB, Geant4, CLHEP, ROOT, Monte Carlo simulation codes
- Storage system
  - HPSS (High Performance Storage System) tape library: 320 TB
  - Disk storage: 45 TB
- Mail system: KEKmail
  - PostKEK (research divisions), MailKEK (administration divisions)
  - Mailing lists, anti-spam, anti-virus
- Web systems: KEK official system, researchers' system, conference system
- Grid systems
  - LHC data grid: LCG system (IBM eServer 326: 36 nodes; xSeries 336: 10 nodes)
  - Storage Resource Broker: SRB system
[Photos: HPSS tape library, computing servers, Grid system, HPSS data/file servers]

2.2. B Factory Computer System
Rental from NetOne (Mar. 2006 ~ Feb. 2012)
- Computing servers: DELL PowerEdge 1855 (2× Xeon 3.6 GHz, 1 GB memory), 1140 nodes: ~4M SI2k, Linux (CentOS/CS)
- Disk storage system: 1 PB, 42 file servers (HSM: 370 TB; non-HSM: 630 TB)
- Tape library system: Petaserv (SONY)
  - 3.5 PB, 60 drives, 13 servers
  - SAIT media: 500 GB/volume, 30 MB/s per drive
[Photos: 3.5 PB tape library, 1 PB disk storage, 1140-node computing servers (4M SI2k)]
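As a sanity check on the tape-library figures quoted above (3.5 PB capacity, 500 GB SAIT volumes, 60 drives at 30 MB/s each), the implied cartridge count and aggregate bandwidth work out as follows. This is a sketch in decimal units, not vendor documentation.

```python
# Derived figures for the SONY tape library described on this slide.
CAPACITY_GB = 3.5 * 1000**2  # 3.5 PB expressed in GB (decimal units)
VOLUME_GB = 500              # GB per SAIT cartridge
DRIVES = 60
DRIVE_MB_S = 30

n_volumes = CAPACITY_GB / VOLUME_GB            # cartridges needed to fill the library
aggregate_gb_s = DRIVES * DRIVE_MB_S / 1000.0  # peak streaming bandwidth, all drives
full_scan_days = CAPACITY_GB / (aggregate_gb_s * 86400)  # time to read everything once

print(n_volumes, aggregate_gb_s, round(full_scan_days, 1))
```

The library holds 7000 cartridges, streams at most 1.8 GB/s in aggregate, and would take roughly three weeks to read end to end even with all drives busy, which illustrates why the 1 PB disk layer in front of it matters.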

2.3. LCG Deployment Plan at KEK
New computer systems:
- Central Information System: since Feb. 20, 2006
- B Factory Computer System: since Mar. 23, 2006
1st phase:
- LCG and SRB for production use are available on the Grid system of the new Central Information System
- Not for public use, but for supporting projects
- Under system maintenance contract with IBM-Japan
- WN: 36 nodes × 2 = 72 CPUs; storage: disk (2 TB) + HPSS (~200 TB)
- Supported VOs: Belle, APDG, Atlas_J
- Service started in May 2006
2nd phase:
- Full support in the Belle production system
- Services started in Feb. 2007

2.4. Supercomputer System
Rental from Hitachi (Mar. 2006 ~ Dec. 2010)
For large-scale simulation: particle and nuclear physics research and accelerator-related scientific studies
- Hitachi SR11000 K1 system: 16 nodes, 2.15 Tflops theoretical peak; large memory capacity (32 GB or 64 GB per node)
- IBM Blue Gene Solution: 10 racks, 57.3 Tflops theoretical peak; massively parallel system for lattice QCD simulation
- About 50 times faster than the former supercomputer system (Hitachi SR8000)
[Photos: Hitachi SR11000 K1, IBM Blue Gene]
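The peak-performance figures on this slide are mutually consistent; the short check below derives the per-rack Blue Gene performance and the former system's implied peak from the "about 50 times faster" claim.

```python
# Consistency check of the theoretical-peak figures quoted on this slide.
SR11000_TFLOPS = 2.15   # Hitachi SR11000 K1, 16 nodes
BLUEGENE_TFLOPS = 57.3  # IBM Blue Gene, 10 racks

combined = SR11000_TFLOPS + BLUEGENE_TFLOPS  # combined peak of the new system
per_rack = BLUEGENE_TFLOPS / 10              # peak per Blue Gene rack
former = combined / 50                       # implied peak of the former system

print(round(combined, 2), round(per_rack, 2), round(former, 2))
```

The combined peak is ~59.5 Tflops, i.e. ~5.73 Tflops per Blue Gene rack, and the 50× claim implies a former-system peak of roughly 1.2 Tflops.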

2.5. Power and Cooling at KEK Computing Research Center
Maximum power consumption and cooling capacity for each machine room:

Place                    System                        Power (Max)   Cooling (Max)
North: Machine Room I    B Factory Computing Servers   650 kVA       612 kW
North: Machine Room II   Supercomputer System          422 kVA       760 kW
South: Machine Room      Central Information System    170 kVA       363 kW
South: Storage Room      B Factory Storage             100 kVA       278 kW

Conceptual view of the machine rooms: the North Building houses Machine Room I (411 m2, B Factory computing servers) and Machine Room II (302 m2, supercomputer system); the South Building houses the Machine Room (482 m2, network and Central Information Systems) and the Storage Room (263 m2, B Factory storage). The air conditioners date from 1985, 1991, 1994, 2000, 2003 and 2006.

Problems:
- The budget for computing resources is cut by 1% per year, which is critical.
- Many legacy air conditioners remain; their cooling efficiency is low, but they are too expensive to replace all at once.
- The ~2 MW of electric power consumed by computers and related equipment has not been seriously discussed at KEK, but will soon become a big problem.
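The table above allows a rough headroom check per room: in the worst case all electrical load ends up as heat that the air conditioning must remove. The power factor used to convert kVA to kW is an assumed value (hypothetical 0.9), not a figure from the talk.

```python
# Rough cooling-headroom check per machine room, from the table on this slide.
ROOMS = {  # name: (max power in kVA, max cooling in kW)
    "North: Machine Room I":  (650, 612),
    "North: Machine Room II": (422, 760),
    "South: Machine Room":    (170, 363),
    "South: Storage Room":    (100, 278),
}
POWER_FACTOR = 0.9  # assumed conversion from kVA to kW of heat load

def cooling_margin_kw(power_kva, cooling_kw, pf=POWER_FACTOR):
    """Cooling capacity left over after absorbing the full electrical load."""
    return cooling_kw - power_kva * pf

margins = {name: cooling_margin_kw(*spec) for name, spec in ROOMS.items()}
print(margins)
```

Under this assumption Machine Room I is the tight one, with only ~27 kW of margin at full load, which is consistent with the slide's concern about the aging air conditioning.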

3.1. Future Plans at KEK
Major projects of KEK: PS / J-PARC, KEKB (KEKB upgrade), LHC/ATLAS
Computer systems: B Factory, Nuclear Physics, Central, Photon Factory, Supercomputer
Networks: KEK SecureNet, JLAN (J-PARC)
[Diagram: mapping of research projects to computer systems; the Nuclear Physics and Photon Factory systems are unified into the Central system]
- Middle of FY2008: the Materials & Life Science Facility at J-PARC will start experiments.
- Beginning of FY2009: T2K will start the experiment.
- End of FY2008: the B Factory experiment will be terminated. After 3 or 4 years for the machine upgrade, a higher-luminosity B experiment will start (FY2012?).
- Before summer 2008: LHC will start the colliding-beam experiment.

3.2. Future Plans at the Computing Research Center
Computer systems:
- Jan. 2006: the Nuclear Physics computer system was unified with the Central Information System.
- Jan. 2009: Central Information System upgrade.
  - The Photon Factory computer system will be unified with this system.
  - This system will support the J-PARC project.
- Feb. 2012: the contracts for the following systems will be terminated:
  - B Factory Computer System (Mar. 2006 ~ Feb. 2012)
  - next Central Information System (Feb. 2009 ~ Jan. 2012)
- KEK will become an LHC Tier-1 center in the long run; when and how are under debate.
Network systems:
- Jan. 2009: the current network system will be replaced by a new one.
  - KEK Secure Network System (including maintenance and operation)
  - Network infrastructure
  - Information security system: firewall, intrusion detection system, intrusion prevention system, etc.
- Jan. 2010: the J-PARC network system will be introduced.