NAREGI at KEK and GRID plans

Takashi Sasaki, KEK Computing Research Center

NAREGI status
- Beta-2 was released in October, packaged as RPMs
- A GRID interoperability package based on GIN is included
- We have recently completed setting up the environment
- We will start testing soon

Future of NAREGI
- NAREGI version 1 will be released in the spring of 2008
- NAREGI will become a subproject of the 10-petaflops computer project: http://www.nsc.riken.jp/index-eng.html
- A budget cut is expected, however, and the future plan is still unclear
- NII will operate the NAREGI GOC as part of its CSI (Cyber Science Infrastructure)

GRID interoperability
- GIN realized file transfer, mutual job submission, and information exchange
- The file catalogue remains an open issue
- We have started working with Osamu Tatebe of the University of Tsukuba

Resource Namespace Service (RNS)
- Specification standardized in the OGF: http://www.ogf.org/documents/GFD.101.pdf
- Middleware independent; no Globus is required
- Two implementations are under way:
  - U. of Tsukuba: http://www.ogf.org/OGF21/materials/957/OGF21%20RNS.pdf
  - University of Virginia
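At its core, RNS (GFD.101) defines a hierarchical namespace whose leaf entries ("junctions") bind logical names to endpoint references, independent of any particular middleware. Below is a minimal sketch of that data model in Python; it illustrates only the virtual-directory/junction idea, and the class names, methods, and the example endpoint are illustrative, not the actual RNS API.

```python
# Minimal sketch of the RNS data model from GFD.101: a virtual
# directory tree whose leaves ("junctions") map logical names to
# endpoint references. Illustrative only; not the real RNS interface.

class Junction:
    """Leaf entry: binds a logical name to an endpoint reference."""
    def __init__(self, endpoint: str):
        self.endpoint = endpoint

class VirtualDirectory:
    """Interior entry: holds named children (directories or junctions)."""
    def __init__(self):
        self.children = {}

    def mkdir(self, name):
        return self.children.setdefault(name, VirtualDirectory())

    def link(self, name, endpoint):
        self.children[name] = Junction(endpoint)

    def resolve(self, path):
        """Walk '/a/b/c' and return the endpoint the final junction holds."""
        node = self
        for part in path.strip("/").split("/"):
            node = node.children[part]  # KeyError if the entry is absent
        if isinstance(node, Junction):
            return node.endpoint
        raise IsADirectoryError(path)

root = VirtualDirectory()
grid = root.mkdir("grid")
grid.link("belle-data", "gsiftp://se.kek.jp/belle/data")  # hypothetical endpoint
print(root.resolve("/grid/belle-data"))
```

Because resolution returns an opaque endpoint string, the same namespace can point into gLite, NAREGI, or SRB/iRODS storage without caring which middleware serves it, which is what makes RNS attractive as the interoperability catalogue here.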

RNS

Key issues
- How to synchronize with local file catalogues such as LFC?
- NAREGI has no built-in file catalogue, and RNS will serve as its local catalogue, too
- Performance and robustness must be tested thoroughly
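One way to frame the synchronization question is as periodic one-way reconciliation from the experiment's catalogue into RNS. The sketch below shows only that reconciliation logic; the `lfc` and `rns` client objects and their `list_entries`/`bind`/`unbind` methods are hypothetical placeholders, not real LFC or RNS bindings.

```python
import time

# Hypothetical one-way reconciliation from an LFC-like catalogue into
# an RNS-like namespace. Both client objects are placeholders; only
# the sync logic itself is the point of this sketch.

def sync_once(lfc, rns, prefix="/grid"):
    """Mirror LFC entries under an RNS prefix; remove stale junctions."""
    lfc_entries = dict(lfc.list_entries())       # {logical_name: replica_url}
    rns_entries = dict(rns.list_entries(prefix))

    for name, url in lfc_entries.items():
        if rns_entries.get(name) != url:
            rns.bind(f"{prefix}/{name}", url)    # create or update junction

    for name in rns_entries.keys() - lfc_entries.keys():
        rns.unbind(f"{prefix}/{name}")           # drop entries gone from LFC

def sync_loop(lfc, rns, interval_s=300):
    while True:
        sync_once(lfc, rns)
        time.sleep(interval_s)
```

Even this simple scheme makes the performance concern concrete: a full scan per cycle scales with catalogue size, so any production design would need change notification or incremental listing rather than periodic full mirrors.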

J-PARC
- The new facility will be commissioned in 2008
- Petabytes of data per year are expected
- Serves HENP, materials science, and biochemical science

KEK and J-PARC
[Map: J-PARC at Tokai and KEK at Tsukuba (B-Factory, Photon-Factory, LC-Test Facility), about 60 km apart; Narita Airport shown for reference]

J-PARC computing
- Data acquisition will be done at Tokai, with only temporary storage there
- Data will be sent to KEK and stored
- Open issues: network bandwidth; how to automate the transfers; how to distribute data outside
- Candidate technologies: SRB/iRODS, LCG/NAREGI
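For scale, 1 PB/year is roughly 8 x 10^15 bits over about 3.15 x 10^7 seconds, i.e. around 250 Mbit/s sustained, so the Tokai-Tsukuba link needs headroom well beyond that for bursts and retransfers. The automation itself could start as a simple watch-and-ship loop over the DAQ spool; in the sketch below the paths are hypothetical and `transfer()` is a placeholder for whichever tool is eventually chosen (GridFTP, SRB/iRODS, ...), not a real client call.

```python
import shutil
import time
from pathlib import Path

# Sketch of "automate the Tokai -> KEK shipping" as a watch-and-ship
# loop over a spool directory. Paths and transfer() are placeholders.

SPOOL = Path("/daq/spool")      # hypothetical DAQ output area at Tokai
SHIPPED = Path("/daq/shipped")  # files are moved here once safely at KEK

def transfer(path: Path) -> bool:
    """Placeholder: ship one file to KEK storage and return True on
    success. Replace with the real backend (GridFTP, iRODS, ...)."""
    return False  # no-op until a real transfer backend is plugged in

def ship_loop(poll_s=60):
    SHIPPED.mkdir(parents=True, exist_ok=True)
    while True:
        for f in sorted(SPOOL.glob("*.raw")):
            if transfer(f):                      # failed files retry next pass
                shutil.move(str(f), SHIPPED / f.name)
        time.sleep(poll_s)
```

Moving a file out of the spool only after a confirmed transfer gives at-least-once delivery, which matters when the Tokai side has only temporary storage.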

Things to be explored
- Integration of the KEKB computer system with the GRID, according to users' preferences
- Mass storage: HPSS or something else?
- Virtual machines: VMware or Xen?
- Load balancing
- File transfer between Tokai and Tsukuba

Future direction
- gLite and NAREGI will be operated at the same time, but worker nodes should still be shared; virtual machines will be used for this
- SRB/iRODS would be a solution for smaller groups
- The same file space should be shared among local access, SRB/iRODS, gLite, and NAREGI

Plan
- NAREGI
  - More tests on beta-2
  - GRID interoperability
- File catalogue
  - Collaboration with U. of Tsukuba
- System replacement
  - Jan. 2009: central computer system moves to a GRID-based system
  - 2012: central computer system and also the KEKB system; unification of the two systems to be considered

Application support should be provided
- Not only HENP, but also materials and biochemical/life science
- Hadron therapy simulation
University support will be continued and extended if funded
- Nagoya, Kobe, Tohoku, Tsukuba, and Hiroshima-IT are receiving our support
- They have complete LCG installations
- Remote installation and support
- A new budget source must be found to hire new people