High Energy Accelerator Research Organization (KEK)

Current Status and Plan on Grid at KEK/CRC
Go Iwai, KEK/CRC, on behalf of the KEK Data Grid Team
2nd Open Meeting of the SuperKEKB Collaboration, March 17, 2009

Links to SAGA are additionally listed in the reference slide (at the end of the presentation file). The institutes list is slightly revised (in slide: Institutes).

Outline
▸ Introduction
  ▹ Deployment
  ▹ VO-specific operational statistics
▸ Recent activities
  ▹ Unified user interfaces using SAGA*
▸ Summary

* SAGA: A Simple API for Grid Applications

Introduction
▸ Our role
▸ Deployment status
▸ VO scale
  ▹ Resources
  ▹ Institutes
  ▹ Members
▸ Ops stats
  ▹ # of jobs
  ▹ CPU consumption

[Map: Tohoku Univ., KEK, Univ. of Tsukuba, Nagoya Univ., Kobe Univ., Hiroshima IT]

Introduction
▸ Major HEP projects:
  ▹ Belle, J-PARC, ATLAS (ongoing projects)
  ▹ ILC, Super-Belle (future projects)
▸ Also covering:
  ▹ Material science, bio-chemistry, and so on, using the synchrotron light and neutron sources
  ▹ RT tech. transfer
▸ We have a role to support university groups in these fields,
  ▹ including Grid deployment/operation.

KEK's Contribution in EGEE-III
▸ TSA1.1: Grid management
  ▹ Interoperability and collaboration
▸ TSA1.2: Grid operations and support
  ▹ 1st-line support for operations problems
  ▹ Middleware deployment and support
    ▪ a) Coordination of middleware deployment and support for problems.
    ▪ b) Regional certification of middleware releases if needed (outside of PPS and SA3 involvement). This is anticipated to be very rare and will require specific justification.

SINET Logical Map Focused on Network Connection for Grid

[Diagram: KEK-1, KEK-2, and NAREGI systems in the GRID-LAN/DMZ behind CC-GRID, with CA, VOMS, LFC, HPSS-DSI, and SRB-DSI services; a UI on the intranet; and university Grid sites connected over SINET]

VO Component

Central services for the Belle virtual organization:
▸ LFC: looking up logical file names
▸ VOMS: logon, VO and privilege management
▸ RB: requesting/booking computing resources
▸ SRM: unified interface to the storage equipment
▸ CE: unified interface to the LRMS
▸ IS: service/resource discovery

Belle-VO-specific architecture: SRB-DSI and SRB in front of the HSM inside the B-Comp system. Directory access is by TURL via GridFTP, not using any logical namespace so far.

Brief Summary of LCG Deployment

JP-KEK-CRC-01 (KEK-1)
▸ Production in GOC since Nov 2005
▸ Mainly operated by KEK staff
▸ Site role:
  ▹ Practical operation for KEK-2
  ▹ Getting started for university groups
▸ Resources and components:
  ▹ SL-3x or SL-4x
  ▹ gLite-3.X
  ▹ CPU: 14
  ▹ Storage: ~7 TB disk, plus DSI for HSM (HPSS)
  ▹ Fully functional services
▸ Supported VOs:
  ▹ belle apdg ail g4med dteam ops ppj ilc calice naokek

JP-KEK-CRC-02 (KEK-2)
▸ Production in GOC since early 2006
▸ Site operation: Manabu and Kohki
▸ Site role:
  ▹ More stable services, based on KEK-1 experience
▸ Resources and components:
  ▹ SL-3x or SL-4x
  ▹ gLite-3.X
  ▹ CPU: 48
  ▹ Storage: ~1 TB disk
  ▹ Fully functional services
▸ Supported VOs:
  ▹ belle apdg ail g4med dteam ops ppj ilc calice naokek

Notes:
▸ Upgrade planned in Q1 of FY: WNs x 8 CPUs x ~4 kSI2K
▸ Storage capability on a demand basis; HPSS virtually works as the backend disk of the SE (~200 USD/1 TB)
▸ VM (Xen) technologies are widely supported site-wide, for higher availability and more robustness
▸ Old blade servers (B-Comp, x 2 CPUs) are now being integrated with KEK; fully upgraded and up soon!

LCG Infrastructure Deployment Status (as of Dec)
▸ 55 countries
▸ 265 sites
▸ 88K CPUs
▸ 130M SI2K
▸ 480 PB available
▸ 580 PB in use

[Chart: deployment by site; annotation: ~10M SI2K]

Resource Deployment over the belle VO
▸ 18M SI2K / 9k CPUs
  ▹ ~10% of the whole production resources
▸ Storage through SRMs
  ▹ 27 TB available
  ▹ 83 GB in use
▸ HSM storage in the KEK Belle Computing System through SRB
  ▹ ~100 TB (ask Nakazawa-san for details)

[Chart: resources by site, FZK-LCG2 at ~10M SI2K; note: KEK-2 still missing]

Institutes
▸ IFJ PAN (CYFRONET) (Poland)
▸ Univ. of Melbourne (Australia)
▸ KEK (Japan)
▸ National Central Univ. (Taiwan)
▸ ASGC (Taiwan)
▸ Nagoya Univ. (Japan)
▸ KISTI (Korea)
▸ Univ. of Karlsruhe (Germany)
▸ Jozef Stefan Institute (Slovenia)
▸ Panjab Univ. (India)
▸ Virginia Polytechnic Inst. & State Univ. (US)
▸ Univ. of Hawaii (US)
▸ Wayne State Univ. (US)
▸ Korea Univ. (Korea)
▸ Univ. of Sydney (Australia)

The VO scale is being expanded slowly; federation done.

Members
▸ /C=JP/O=KEK/OU=CRC/OU=KEK/CN=Nishida Shohei
▸ /C=JP/O=KEK/OU=CRC/OU=KEK/CN=Yoshimi Iida
▸ /C=JP/O=KEK/OU=CRC/CN=Go Iwai
▸ /C=JP/O=KEK/OU=CRC/OU=Nagoya/CN=Yuko Nishio
▸ /C=AU/O=APACGrid/OU=The University of Melbourne/CN=Glenn R. Moloney
▸ /C=JP/O=KEK/OU=CRC/OU=Korea University/CN=Hyuncheong Ha
▸ /C=JP/O=KEK/OU=CRC/CN=Yoshiyuki WATASE
▸ /C=JP/O=KEK/OU=CRC/CN=Hideyuki Nakazawa
▸ /C=JP/O=KEK/OU=CRC/CN=YAMADA Kenji
▸ /C=JP/O=KEK/OU=CRC/OU=Nagoya university HEPL/CN=kenji inami
▸ /C=JP/O=KEK/OU=CRC/OU=Nagoya university HEPL/CN=Mitsuhiro Kaga
▸ /C=JP/O=KEK/OU=CRC/CN=Jun Ebihara
▸ /C=JP/O=KEK/OU=CRC/OU=Korea University/CN=Soohyung Lee
▸ /C=JP/O=KEK/OU=CRC/CN=Manabu Matsui
▸ /C=SI/O=SiGNET/O=IJS/OU=F9/CN=Marko Bracko
▸ /C=JP/O=KEK/OU=CRC/CN=Kenn Sakai
▸ /C=JP/O=KEK/OU=CRC/CN=Yugawa Takahiro
▸ /C=JP/O=KEK/OU=CRC/CN=Yamada Chisato
▸ /O=GermanGrid/OU=Uni Karlsruhe/CN=Thomas Kuhr
▸ /O=GermanGrid/OU=FZK/CN=Dimitri Nilsen
▸ /C=KR/O=KISTI/O=GRID/O=KISTI/CN= Beob Kyum Kim
▸ /C=JP/O=KEK/OU=CRC/CN=Shunsuke Takahashi

90% of the 22 members are ops staff.

Ops Stats: JFY2006 & JFY2007

[Charts: jobs submitted (100 kJobs, KEK-1 & KEK-2) and CPU time consumed (300 kHrs, KEK-1 & KEK-2)]

Ops Stats: JFY2008

[Charts: 18 kJobs submitted and 170 kHrs of CPU time consumed so far this year]

Service Availability, Jan-Dec 2008
▸ 127 H / 4 SD; 12 tickets were opened, all solved
▸ 931 H / 12 SD; 13 tickets were opened, all solved

More than 90% availability in 2008!

Recent Activities
SAGA: A Simple API for Grid Applications
▸ Motivation
▸ Goal
▸ Current status

Grid Deployment at KEK: Middleware/Experiment Matrix

                 gLite      NAREGI       Gfarm     SRB        iRODS
  Belle          Using      Planning               Using
  Atlas          Using
  Radio therapy  Using      Developing                        Planning
  ILC            Using                             Planning
  J-PARC         Planning                Testing
  Super-Belle    To be decided by 2010

▸ Most experiments and federations commonly use gLite as the Grid middleware.
▸ The NAREGI middleware is being deployed as the general-purpose e-science infrastructure in Japan.
  ▹ Difficulties: e.g. human costs, time differences
  ▹ Interoperation between both middlewares is mandatory for us (next few slides),
    ▪ to provide higher availability and reliability, and to keep production quality.

Issues on Multi-Middleware Apps
▸ For site admins:
  ▹ Dedicated hardware is deployed for each middleware
    ▪ LRMS
    ▪ OS
▸ For end users:
  ▹ In the ordinary way, the same application has to be developed once per middleware to be enabled on the Grid.
  ▹ They have to know which middleware they are using.

[Diagram: gLite, NAREGI, SRB, and iRODS stacks, each with its own dedicated CPUs and storage]

Users have to be aware of the underlying middleware layer and the deployed hardware.

Motivation
▸ We need to operate multiple Grid middlewares at the same time.
  ▹ Resource sharing among them is mandatory.
    ▪ We are also contributing to GIN.
▸ Virtualization of the Grid middleware is our wish:
  ▹ the best scenario for application developers.
▸ Today's topic: SAGA-NAREGI.

[Diagram: applications on top of the SAGA engine and its adaptors; underneath, gLite, NAREGI, SRB, iRODS, cloud, and LRMS (LSF/PBS/SGE/...) share resources through a GIN/PGI multi-middleware layer]
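The middleware-virtualization wish can be made concrete with a short sketch. The fragment below is a minimal example, assuming the SAGA C++ reference implementation with the NAREGI and local (fork) job adaptors installed: the application code stays identical, and only the URL scheme passed to saga::job::service decides which middleware runs the job.

  #include <saga/saga.hpp>
  #include <iostream>

  int main()
  {
    namespace sja = saga::job::attributes;

    // One job description, independent of the target middleware
    saga::job::description jd;
    jd.set_attribute(sja::description_executable, "/bin/hostname");

    // The URL scheme selects the adaptor: NAREGI here, or
    // e.g. "fork://localhost" to run the same job locally.
    saga::job::service js("naregi://nrgvms.cc.kek.jp");

    saga::job::job j = js.create_job(jd);
    j.run();
    j.wait();  // block until the job reaches a final state

    std::cout << "job finished, state = " << j.get_state() << std::endl;
    return 0;
  }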

Project Goal (Funded collaboration with NII)

1. Middleware-transparent layer
   ▹ GIN/PGI: multi-middleware layer
2. Middleware-independent services
   ▹ RNS FC service based on the OGF standard
   ▹ SAGA supports cloud and LRMS (LSF/PBS/SGE/...) for local clusters

[Diagram: services and applications on top of the SAGA engine, via the C++ interface and Python binding; SAGA adaptors for gLite, NAREGI, SRB, and iRODS over CPUs and storage]
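For the middleware-independent file catalog, the fragment below sketches what client code could look like through the SAGA replica package; the rns:// catalog URL and the logical path are hypothetical, and the actual RNS service interface may differ, since the slide only names the service.

  #include <saga/saga.hpp>
  #include <iostream>
  #include <vector>

  int main()
  {
    // Hypothetical catalog endpoint and logical file name; the adaptor
    // behind the URL hides whether the catalog is RNS, LFC, or other.
    saga::replica::logical_file lf(
        saga::url("rns://catalog.cc.kek.jp/belle/mc/sample.mdst"),
        saga::replica::Read);

    // Resolve the logical name to its registered physical replicas
    std::vector<saga::url> locations = lf.list_locations();
    for (std::size_t i = 0; i < locations.size(); ++i)
      std::cout << locations[i].get_string() << std::endl;

    return 0;
  }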

Current Status: SAGA-NAREGI Adaptor
▸ Job adaptor only, so far
▸ Succeeded in submitting a job to NAREGI and retrieving the results

  namespace sja = saga::job::attributes;

  // Job description
  saga::job::description jd;
  jd.set_attribute(sja::description_executable, "/bin/hostname");
  jd.set_attribute(sja::description_working_directory, "/some/where/work/dir");
  jd.set_attribute(sja::description_output, "std.out");
  jd.set_attribute(sja::description_error, "std.err");

  // File stage-in/out
  std::vector<std::string> ft;
  ft.push_back("gsiftp://gfarm.cc.kek.jp/my/file.in > file.in");
  ft.push_back("gsiftp://gfarm.cc.kek.jp/my/file.out < file.out");
  jd.set_vector_attribute(sja::description_file_transfer, ft);

  // Job submission
  saga::job::service js("naregi://nrgvms.cc.kek.jp");
  saga::job::job j = js.create_job(jd);
  j.run();
  while (j.get_state() != saga::job::Done) {
    std::cout << j.get_attribute("JobID") << std::endl;
    sleep(1);
  }
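Since submission crosses the WAN to the NAREGI front-end, a real client should also expect failures. A minimal sketch of error handling around the same fragment, using the saga::exception type from the SAGA C++ API (the endpoint is the one from the slide):

  try {
    saga::job::service js("naregi://nrgvms.cc.kek.jp");
    saga::job::job j = js.create_job(jd);
    j.run();
  }
  catch (saga::exception const& e) {
    // Missing proxy, unreachable endpoint, etc. surface here
    std::cerr << "job submission failed: " << e.what() << std::endl;
  }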

Summary

Summary
▸ The Belle VO has been expanding:
  ▹ 9 institutes and 22 users
  ▹ 18M SI2K / 9k CPUs
  ▹ 27 TB available, 83 GB in use through SRMs
    ▪ The HSM in the KEK Belle Computing System is used through SRB.
  ▹ KEK-2 will come up very soon:
    ▪ in the final state of the certification process.
▸ SAGA-NAREGI is ready for use:
  ▹ job adaptor only, currently.
  ▹ SAGA-PBS is now being developed and will be released soon (in March 2009).
  ▹ This project has been funded for 3.5 years and ends in March.

References
▸ SAGA
▸ VOMS end point
▸ VO setting parameters
▸ VO ID card
  ▹ https://cic.gridops.org/index.php?section=vo&page=homepage&subpage=&vo=belle
▸ VOMS certificate
  ▹ https://cic.gridops.org/downloadRP.php?section=database&rpname=certificate&vo=belle&vomsserver=voms.kek.jp
▸ voms.kek.jp will move to voms.cc.kek.jp at the next power cut in August.