Presentation on theme: "Workshop KEK - CC-IN2P3: LCG update and plan at KEK", Go Iwai, KEK/CRC, 27 – 29 Oct. 2007 @ CC-IN2P3, Lyon, France (Day 1, 15:25 - 16:05, 40 min). Presentation transcript:

1 Workshop KEK - CC-IN2P3: LCG update and plan at KEK. Go Iwai, KEK/CRC. 27 – 29 Oct. 2007 @ CC-IN2P3, Lyon, France. Day 1, 15:25 - 16:05 (40 min)

2 Agenda
- Introduction: Grid at KEK, its role and stance, etc.; brief history and summary on Grid; ops stats; Grid-related services; networking (SINET3)
- Particular activities over the VOs: Belle experiment, accelerator science over SINET3, ILC experiments, Geant4 medical application
- NAREGI: beta 1 activities at KEK; beta 2 current status
- Summary

3 Introduction
- Grid at KEK, its role and stance, etc.
- Brief history and summary on Grid
- Ops stats
- Grid-related services
- Networking (SINET3)

4 KEK's Role, Stance, etc.
- KEK-EGEE II collaboration: focusing on operation and management (SA1), with much support from ASGC, the Asia-Pacific ROC.
- Domestic support: KEK has a role to offer necessary assistance to university groups in Japan. They are interested in Grid, but most of them are not production sites and seem to hesitate to participate in EGEE. Unfortunately, graduate students in physics are mostly the main human resource supporting IT, so supporting their deployment, operation and monitoring is KEK's role, in place of a ROC.
- We are a Tier-2 centre, but not for any LHC experiment: the T2 for ATLAS is ICEPP, Univ. of Tokyo, and the T2 for ALICE is Hiroshima University.
- Currently our Grid activities focus on operation and management and on developing gridified applications over the LCG infrastructure, but not for the LHC experiments.
- Encourage users to use the Grid. These are the circumstances at KEK only, not at other institutions.

5 People on Grid at KEK/CRC
- 7 persons in total (today's participants were underlined on the original slide)
- CA: T. Sasaki and Y. Iida
- VOMS: Y. Watase and G. Iwai
- Site operation and security: KEK-1: T. Sasaki, Y. Iida, Y. Watase and G. Iwai; KEK-2: T. Sasaki, Y. Watase and G. Iwai; NAREGI: Y. Watase and G. Iwai
- Deployment: Y. Watase, Y. Iida and G. Iwai
- Documentation: Y. Watase
- Networking: S. Suzuki, S. Yashiro and Y. Iida
- Application (SRB, portal and some gridified applications): K. Murakami, Y. Iida and G. Iwai

6 Brief History on Grid
- 2005 Q1-Q2: Testbed project started with LCG-2.6; federation among a few institutions
- 2005 Q3-Q4: "HEP Data Grid Workshop" held at KEK; the KEK-1 system was introduced based on experience from the workshop
- 2006 Q1: KEK Grid CA accredited as production; JP-KEK-CRC-01 approved as a certified site
- 2006 Q2: JP-KEK-CRC-02 (LCG-2.7) approved as a certified site; NAREGI beta 1 released
- 2006 Q3: 1st KEK-IN2P3 workshop at IN2P3; KEK-1 and KEK-2 upgraded to gLite-3.0
- 2007 Q1: 2nd KEK-IN2P3 workshop at KEK
- 2007 Q3: NAREGI beta 2 released
- 2007 Q4: 3rd KEK-IN2P3 workshop at IN2P3

7 Brief Summary of Grid Deployment
- Two sites are in operation, deployed on logically different networks.
- JP-KEK-CRC-01: since Nov 2005; usage is experimental use and R&D, but it runs as a production site in the LCG framework.
- JP-KEK-CRC-02: since Jan 2006; more stable services, based on the experience with KEK-1.
- NAREGI: using NAREGI beta 1, released in May 2006; testing and evaluating what is lacking for our usage.
- Accepted VOs: belle, ppj, ilc, calice, g4med, dteam, ops, and apdg and ail (the last two are now gone).
(Diagram: JP-KEK-CRC-01, JP-KEK-CRC-02 and the NAREGI testbed behind the KEK firewall, on the KEK LAN and SINET.)

8 KEK-1: JP-KEK-CRC-01
- SL30X/gLite-3.0; not yet migrated to SL4
- CPU: 14 x P4
- Storage: ~1.5 TB
- Services hosted: FTS, RB, CE, SE, MON, WMS/LB, UI, BDII/PX, LFC, VOBOX, WEB and DB, on service nodes dg08 - dg20 (dgXX.cc.kek.jp, public network 130.87.208.0/22)
- Worker nodes wn001 - wn015 on the private network 192.168.1.0/24
(Diagram: the KEK-1 nodes on the KEK LAN behind the KEK firewall, alongside KEK-2 and NAREGI on SINET.)

9 KEK-2: JP-KEK-CRC-02
- SL30X/gLite-3.0; not yet migrated to SL4
- CPU: 48 x Opteron 252 (2 processors on each node)
- Storage: ~2.0 TB; only disk is in use, HPSS is excluded
- Services hosted: CE, RB, BDII, LFC, SE and MON, on service nodes rls01 - rls08 (rlsXX.cc.kek.jp, 202.13.197.0/24 behind an L3 switch on SINET)
- Worker nodes rlc01 - rlc24 on the private network 10.32.2.0/24
(Diagram: the KEK-2 nodes on SINET behind the KEK firewall, alongside KEK-1 and NAREGI on the KEK LAN.)

10 Ops Stats at KEK-1, Last 2 Years (JP-KEK-CRC-01)
- Number of jobs (without ops and dteam): 7,677
- CPU time normalized to 1 kSI2K: 33,140 hrs*kSI2K
(Chart of jobs and CPU time over 2005-2007.)

11 Ops Stats at KEK-2, Last 2 Years (JP-KEK-CRC-02)
- Number of jobs (without ops and dteam): 11,951
- CPU time normalized to 1 kSI2K: 142,960 hrs*kSI2K (the normalization is sketched below)
(Chart of jobs and CPU time over 2005-2007.)
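For reference, the "CPU time normalized by 1 kSI2K" figures quoted above are just raw CPU hours scaled by the per-core SI2K rating of the worker nodes. A minimal sketch of that conversion; the per-core rating used here is an illustrative guess, not an official figure, and the real accounting is done by the EGEE accounting infrastructure rather than by code like this:

```python
# Convert raw CPU hours into kSI2K-normalized hours (hrs*kSI2K).

def normalized_cpu_hours(raw_cpu_hours: float, core_rating_ksi2k: float) -> float:
    """Scale raw CPU hours by the per-core kSI2K rating of the worker node."""
    return raw_cpu_hours * core_rating_ksi2k

# Hypothetical example: a job that used 120 CPU hours on a 1.8 kSI2K core.
print(normalized_cpu_hours(120.0, 1.8))   # -> 216.0 hrs*kSI2K
```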

12 KEK Grid CA (http://gridca.kek.jp)
- The KEK Grid CA has been operating since Jan 2006.
- 75 CAs are in production all over the world, 3 of them in Japan: AIST, NAREGI and KEK.
- It is accredited by the IGTF (International Grid Trust Federation), which consists of APGridPMA (Asia-Pacific), EUGridPMA (Europe) and TAGPMA (the Americas), and it is also recognized by LCG.
- The KEK Grid CA was audited by Yoshio Tanaka (chair of APGridPMA, AIST) in May 2007 and passed; for cross-checking, KEK audited the NAREGI CA in July 2007.
- Statistics of issued certificates:
  Certificate type | JFY2006 (Apr 2006 - Mar 2007) | JFY2007 (Apr 2007 - )
  Globus client certificate (personal) | 68 | 119
  Globus server certificate (host) | 139 | 238
  Web server certificate | 4 | 0

13 VOMS Operated at KEK (http://voms.kek.jp)
- VOMS has been in production service since Sep 2006, after tests from Nov 2005 (obtaining a proxy from it is sketched below).
- The VOMS server supports the following VOs:
  - BELLE: the Belle experiment (only belle is registered in the CIC portal); the biggest target for us
  - PPJ: accelerator science in Japan
  - G4MED: Geant4 medical application for radiotherapy
  - APDG: R&D of data grids in the Asia-Pacific region
  - ATLASJ: the ATLAS experiment, Japanese group only
  - AIL: Associated International Laboratory between KEK and France
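To use any of these VOs, a user first obtains a VOMS proxy from the server above. A minimal sketch of that step, driving the standard VOMS client command voms-proxy-init from Python; the VO name and lifetime below are only examples, and the exact client options available depend on the gLite UI version installed:

```python
import subprocess

def make_voms_proxy(vo: str, lifetime: str = "24:00") -> None:
    """Create a VOMS proxy for the given VO using the gLite UI client tools.

    Assumes voms-proxy-init/-info are on PATH and that the VO (e.g. 'belle')
    is configured in the local vomses files.
    """
    subprocess.run(["voms-proxy-init", "-voms", vo, "-valid", lifetime], check=True)
    subprocess.run(["voms-proxy-info", "-all"], check=True)  # show the granted attributes

if __name__ == "__main__":
    make_voms_proxy("belle")
```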

14 Networking on Grid
- The production R&E network "SuperSINET/SINET" operated by NII was upgraded to "SINET3" in April 2007.
- Two-tier structure with edge and core layers: the edge layer consists of edge layer-1 switches with Ethernet interfaces to accommodate users' equipment; the core layer consists of core layer-1 switches and high-performance IP routers and constitutes a nationwide, reliable backbone network.
- 63 edge nodes and 12 core nodes, i.e. 75 L1 switches and 12 IP/MPLS routers; 2 x 10 GbE for KEK.
- The line speed between edge and core nodes is 1 to 20 Gbps, and the backbone between core nodes runs at up to 40 Gbps.
- The emphasis is shifting towards applications rather than raw bandwidth; Grid deployment and a virtual organization for HEP in Japan are the issues.
(Map of edge nodes and core nodes, with 1-10 Gbps and 10-40 Gbps links.)

15 Particular Activities over the VOs
- Belle experiment
- Accelerator science over SINET3
- ILC experiments
- Geant4 medical application

16 BELLE: The VO for the Belle Experiment
- The Belle VO is federated among 5 countries, 7 institutes and 10 sites: Nagoya University, University of Melbourne, ASGC, NCU, CYFRONET and Korea University, plus KEK-1/2.
- The VOMS is operated by KEK (http://voms.kek.jp/).
- Past activities:
  - Federation among sites and Belle library installation
  - Submitting MC production jobs for more realistic use; these are long-running jobs, MC production usually taking about one week (a job-description sketch follows below)
  - Functional and performance tests over the VO
  - Finding a way to access the existing data: SRB (next slide)
(Map: Univ. of Melbourne (Australia), NCU and ASGC (Taiwan), Nagoya Univ. and KEK (Japan), CYFRONET (Poland), Korea Univ. (Korea).)
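As an illustration of how such MC production jobs reach the sites, the sketch below writes a minimal gLite/LCG-style JDL file for the belle VO. The executable, sandbox files and requirement are invented placeholders; the actual Belle production scripts and the submission command of the time may have differed.

```python
# A hypothetical, minimal JDL for a Belle MC production job (gLite/LCG style).
# File names and the executable are placeholders, not the real Belle tools.
JDL = """
Executable          = "run_belle_mc.sh";
Arguments           = "--events 10000";
StdOutput           = "mc.out";
StdError            = "mc.err";
InputSandbox        = {"run_belle_mc.sh"};
OutputSandbox       = {"mc.out", "mc.err"};
VirtualOrganisation = "belle";
Requirements        = other.GlueCEPolicyMaxWallClockTime > 10080;
"""

with open("belle_mc.jdl", "w") as f:
    f.write(JDL)

# Submission would then go through the workload management tools on the UI
# (an edg-job-submit or glite-wms-job-submit call, depending on the
# middleware version in use at the time).
```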

17 The Choice and Design Using SRB-DSI
- A user can access the data from outside with just a GridFTP client (sketched below), while an SRB client can also access the HSM from inside.
- Benefit for both SRB users and LCG users: both can read and write from/to the HSM without caring about the protocol, and both protocols are authorized through GSI.
(Diagram: remote Belle sites (Melbourne, NCU, ASGC, KU, Nagoya, CYFRONET) reach an enhanced GridFTP service in the KEK DMZ over GEANT2/APAN/SINET; the SRB-DSI pluggable extension sits in front of the SRB server and MCAT, which access the 3.5 PB HSM behind the KEK firewall; the HSM is shared with the Belle analysis computing farm (LSF, still not integrated with the Grid) over a dedicated internal Belle network.)
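A rough sketch of what "access with just a GridFTP client from outside" looks like in practice: a plain globus-url-copy against the SRB-DSI endpoint, wrapped in Python. The host name and paths are placeholders, not the real KEK layout, and a valid GSI proxy (e.g. for the belle VO, as in the earlier VOMS sketch) is assumed.

```python
import subprocess

# Placeholder endpoint; the real SRB-DSI GridFTP host/path layout at KEK
# is not reproduced here.
SRB_DSI_ENDPOINT = "gsiftp://srb-dsi.example.kek.jp:2811"

def fetch(remote_path: str, local_path: str) -> None:
    """Copy a file from the SRB/HSM-backed GridFTP service to local disk.

    Requires a valid GSI proxy and the Globus globus-url-copy client on PATH.
    """
    subprocess.run(
        ["globus-url-copy", f"{SRB_DSI_ENDPOINT}{remote_path}", f"file://{local_path}"],
        check=True,
    )

if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    fetch("/belle/mc/sample.mdst", "/tmp/sample.mdst")
```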

18 Ops Stats over the Belle VO, Last 2 Years
- Number of jobs: 12,417, of which 7,706 were processed at KEK-1/2
- Normalized CPU time: 220,809 hrs*kSI2K, of which 122,032 were used at KEK-1/2
(Chart of jobs and CPU time over 2005-2007.)

19 Computing Resources over the Belle VO
- CPU: 883 cores, 1,260 kSI2K in total (chart of CPU resources per site)
- Storage: 22.5 TB in total, 5.94 TB in use (chart of used vs available capacity per site)

20 PPJ: The VO for Accelerator Science in Japan
- Federated among major university groups and KEK, within Japan only: Tohoku U (KamLAND, ILC), Tsukuba U (CDF), Nagoya U (Belle, ATLAS), Kobe U (ILC, ATLAS) and Hiroshima IT (ATLAS, computing science).
- We have a common VO that does NOT depend on particular scientific projects; it is used to test each site.
- KEK assists their operation over this VO, with the same motivation as the ops VO.
- Computing resources over the PPJ VO:
  Site | Tohoku U | KEK | Tsukuba U | Nagoya U | Kobe U | HIT | Total
  CPU (kSI2K) | 0.68 | 91 | 5.1 | 8.3 | 8.5 | 1.2 | 115
  SE (GB) | 150 | 2,676 | 65 | 150 | 68 | 36 | 3,145

21 The Supporting System
- A monitoring system based on Nagios and a wiki has been developed over the PPJ VO to support operation at the universities.
- The monitoring portal automatically creates links based on a knowledge base and navigates administrators to the appropriate troubleshooting page on the wiki (sketched below).
(Screenshots by Mineo Fujii, Hiroshima-IT: the supporting system consists of the monitoring system, the knowledge DB and a wiki-based FAQ. The summary view shows each site as an icon on a map, coloured by status, e.g. yellow for warning and red for error; the thickness and colour of the lines indicate RTT and network status. The monitoring view lists, per site, the results of a few simple test jobs and commands, with a link to the FAQ generated from the error description.)
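The "automatic link to the troubleshooting page" idea can be pictured in a few lines of Python. Everything here (the wiki URL, the error patterns, the page names) is invented for illustration; the real portal is Nagios- and wiki-based and its knowledge base is maintained by the PPJ operators.

```python
# Illustrative only: map an error description from a site test to a wiki FAQ page.
WIKI_BASE = "https://ppj-wiki.example.kek.jp/FAQ/"   # placeholder URL

# Minimal "knowledge base": substring of the error message -> FAQ page name
KNOWLEDGE_BASE = {
    "authentication failed": "ProxyOrCertificateProblem",
    "Cannot plan": "NoMatchingResource",
    "Connection timed out": "NetworkOrFirewall",
}

def faq_link(error_description: str) -> str:
    """Return the troubleshooting page for a known error, or the FAQ index."""
    for pattern, page in KNOWLEDGE_BASE.items():
        if pattern in error_description:
            return WIKI_BASE + page
    return WIKI_BASE   # unknown error: send the admin to the FAQ top page

print(faq_link("Job held, reason: authentication failed with the CE"))
```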

22 ILC/CALICE: The VOs for Linear Collider Experiments
- The ILC and CALICE VOs, supported by DESY, have been ready at KEK since the end of 2006.
- The initial goal has been achieved: triangular file sharing/transfer among DESY, IN2P3 and KEK over the VO.
- ILC: 32,793 cores, 35,384 kSI2K; storage 68.4 TB (12.6 TB in use); 69 members (4 from Japan)
- CALICE: 13,469 cores, 15,140 kSI2K; storage 203 TB (15.6 TB in use); 52 members (3 from Japan)

23 Ops Stats, Last 2 Years (ILC and CALICE)
- ILC: 150,269 jobs, of which 955 were processed at KEK-1/2; 323,251 hrs*kSI2K normalized CPU time, of which 569 were used at KEK-1/2
- CALICE: 145,776 jobs, of which 579 were processed at KEK-1/2; 338,531 hrs*kSI2K normalized CPU time, of which 1,061 were used at KEK-1/2

24 G4MED: Geant4 Medical Application
- We provide a framework and software toolkit for simulation in radiotherapy.
- The software provides: CT/MRI data conversion into the Geant4 simulation; simulation load sharing and data sharing with data-grid technology; visualization; interactivity; a web-based interface (hospitals are strongly protected by firewalls); etc.

25 Screenshots of the Web Interface
- The web interface has been developed as the first practical implementation.
- Job control and getting/viewing results are available with just a web browser, without any rich client.
- Development continues, to improve usability and to integrate other components.
(Screenshots: login view, proxy certificate creation, job status view, job history view.)

26 High-Priority Issues in LCG
- CE and WN:
  - Migration to SL4 (WN, lcg-CE) for higher application performance
  - Queue settings: the queues are always occupied by jobs, and jobs from ops sometimes expire
  - Migration to LSF: sharing resources between local users and grid users; most batch systems at KEK use LSF, most staff have a lot of know-how with it, and Maui is rather limited
- SE integration:
  - Currently only disk is in use, which is rather limited for us; we want larger-scale storage, e.g. a tape system
  - We are using DPM/SRM as the head node of the SE
  - Access to the HSM via SRB-DSI is established, hopefully HPSS via HPSS-DSI soon; both SRB-DSI and HPSS-DSI can be reached via GridFTP, but not through the LFC
  - Keep in contact with the application team at ASGC
- Networking/security: always a trade-off between convenience and security; how to manage easily and quickly while assuring security is an ever-present subject
- More robust and higher-performance services: using VMs and a redundant design; how? Round-robin DNS? (see the sketch below)
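On the "round-robin DNS?" question: the idea is simply to publish several A records under one service alias so that clients spread across, and survive the loss of, individual nodes. A tiny Python check of what a resolver returns for such an alias; the host name is a placeholder, not a real KEK service.

```python
import socket

def resolve_all(alias: str, port: int = 2811) -> list:
    """List every IPv4 address published for a service alias.

    With round-robin DNS, repeated lookups rotate through these addresses,
    giving crude load spreading and failover for stateless grid services.
    """
    infos = socket.getaddrinfo(alias, port, family=socket.AF_INET,
                               type=socket.SOCK_STREAM)
    return sorted({sockaddr[0] for *_, sockaddr in infos})

if __name__ == "__main__":
    # Placeholder alias; a real deployment would use e.g. an SE service name.
    print(resolve_all("se.example.kek.jp"))
```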

27 NAREGI
- Beta 1: activities at KEK
- Beta 2: current status

28 Introduction to NAREGI
- NAREGI: NAtional REsearch Grid Initiative
- Funding: 2003-2007, 10 billion yen over 5 years (initial plan), extended to 2009
- Host institute: National Institute of Informatics (NII)
- Core collaborators: IMS (molecular science), AIST (industrial applications), TIT, Osaka, Hitachi, Fujitsu, NEC
- Mission: R&D of Grid middleware for research and industrial applications towards an advanced infrastructure
  - The primary target application is nanotechnology for innovative and intelligent materials production
  - More focused on a computing grid linking supercomputer centres for coupled simulation of multi-scale physics
  - Supports heterogeneous computer architectures (vector, massively parallel and clusters)
  - The data-grid part was integrated in 2005

29 NAREGI Architecture
- A complete set of Grid middleware, with advanced implementations of the OGF (GGF) standards as a contribution to the Grid world.
- OGSA/WSRF (Web Services Resource Framework) compliant architecture based on Globus Toolkit 4.0 (GT2 in gLite); it provides web-service interfaces for various services, not only resource brokering but also resource reservation, co-allocation and co-scheduling.
- GGF JSDL 1.0 for job submission (JDL in gLite); a JSDL sketch follows below.
- Resource information based on the CIM standard schema (GLUE schema in gLite).
- Security: adopts VOMS from EGEE.
- Information service interface: OGSA-DAI (Data Access and Integration).
- Many standards adopted; NAREGI beta 1 was released in May 2006.
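For a flavour of the contrast between JSDL and gLite's JDL, here is a minimal JSDL 1.0 document assembled with Python's standard library. The namespaces and element names follow the public GGF JSDL 1.0 and POSIX-application schemas; the job itself (a bare /bin/hostname) is only an example, and in practice NAREGI jobs are composed through the portal and workflow tool rather than hand-written.

```python
import xml.etree.ElementTree as ET

JSDL_NS  = "http://schemas.ggf.org/jsdl/2005/11/jsdl"
POSIX_NS = "http://schemas.ggf.org/jsdl/2005/11/jsdl-posix"

def jsdl(tag):  return f"{{{JSDL_NS}}}{tag}"
def posix(tag): return f"{{{POSIX_NS}}}{tag}"

# Minimal JSDL 1.0 job definition: run /bin/hostname and capture stdout.
job   = ET.Element(jsdl("JobDefinition"))
desc  = ET.SubElement(job, jsdl("JobDescription"))
ident = ET.SubElement(desc, jsdl("JobIdentification"))
ET.SubElement(ident, jsdl("JobName")).text = "hello-naregi"
app   = ET.SubElement(desc, jsdl("Application"))
px    = ET.SubElement(app, posix("POSIXApplication"))
ET.SubElement(px, posix("Executable")).text = "/bin/hostname"
ET.SubElement(px, posix("Output")).text = "hostname.out"

print(ET.tostring(job, encoding="unicode"))
```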

30 NAREGI Beta 1 at KEK
- Testbed: 9 server nodes and 5 compute nodes
- Middleware installation:
  - NAREGI beta 1.0.1 (Jun 2006 - Nov 2006): manual installation for all the steps; confirmed the functionality of the Information Service, PSE (Problem Solving Environment), WFT (GUI Workflow Tool) and GVS (Grid Visualization System)
  - NAREGI beta 1.0.2 (Feb 2007): a comprehensive data-grid installation manual was released in Jan 2007
- Site federation test in Sep 2007 between KEK and NAREGI/NII
- Job tests with application software: data analysis of an ion-scattering experiment, Geant4 simulation with MPI, Belle event simulation

31 Configuration of the NAREGI Testbed at KEK
(Diagram: the NAREGI testbed sits behind the KEK firewall alongside KEK-1/2 and connects to SINET. Server nodes nrg00 - nrg11 on a private network (.keknrg.jp, 192.168.2.x, behind NAT/DNS) host the portal, super scheduler (SS), information service (IS/MDS), GridVM server, metadata server, access manager (AMS), the Gfarm file system node and UMS/VOMS; GridVM engine nodes serve as the compute nodes.)

32 Data Storage with Gfarm
- The data-grid part of the testbed is built on Gfarm: data files are stored on multiple disk servers under the Gfarm file system.
- Input and output data are staged in to and out of the Gfarm storage (a rough sketch of the staging steps follows below).
- A Gfarm client installed on the engine (worker) nodes can access the data files directly through ordinary program reads and writes, with no change to the application program (Belle event simulation).
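A rough sketch of the stage-in/stage-out idea using the Gfarm v1 command-line client, driven from Python. The command names (gfreg, gfexport) are the standard Gfarm 1.x tools as far as I know, the paths are placeholders, and on the testbed the staging is actually handled by the NAREGI workflow/GridVM layer rather than by hand.

```python
import subprocess

def stage_in(local_path: str, gfarm_path: str) -> None:
    """Register a local input file into the Gfarm file system (stage-in)."""
    subprocess.run(["gfreg", local_path, gfarm_path], check=True)

def stage_out(gfarm_path: str, local_path: str) -> None:
    """Copy an output file back out of Gfarm to a local file (stage-out)."""
    with open(local_path, "wb") as out:
        subprocess.run(["gfexport", gfarm_path], check=True, stdout=out)

if __name__ == "__main__":
    # Placeholder paths; on the engine nodes the application itself reads and
    # writes these files directly, with no change to the program.
    stage_in("belle_input.mdst", "gfarm:/belle/input/belle_input.mdst")
    stage_out("gfarm:/belle/output/belle_output.mdst", "belle_output.mdst")
```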

33 Next Steps for Us
- NAREGI beta 2 was released in Oct 2007; we are installing it now, mostly done, but the data-grid part is not yet.
- Features:
  - "Easy" installation by apt-rpm, greatly improved: beta 1 took about 6 months, beta 2 about 2 hours (the data-grid part is excluded)
  - Interoperation with EGEE/gLite: job submission, data exchange, information exchange
  - Various useful features for applications: GridMPI (MPI jobs linking over sites), burst job submission, and a GridFTP interface to Gfarm files, which we expect will perhaps work with SRM
- Release of NAREGI version 1.0 in Apr-May 2008.

34 Summary (1/2)
- Two systems are in production operation; main usage: KEK-1 for R&D, KEK-2 for higher-quality services.
- NAREGI beta 1 has been installed and tested, and is being replaced with beta 2 very soon (this month).
- Other Grid-related services: KEK Grid CA and VOMS.
- 19,628 jobs have been processed and 176,100 normalized CPU hours used at KEK in the last 2 years:
  Site | # of cores | SPEC (kSI2K) | Storage (TB) | # of jobs | CPU (kSI2K*hrs)
  KEK-1 | 14 | 15 | 0.26/1.41 (18%) | 7,677 | 33,140
  KEK-2 | 44 | 80 | 0.57/2.03 (28%) | 11,951 | 142,960

35 Summary (2/2)
- BELLE VO: the GridFTP server plugged into SRB-DSI is in operation; we succeeded in establishing a way to access the existing data with a GridFTP client from outside; the Belle library is installed remotely and MC production is available at each site.
- ILC/CALICE VO: triangular file sharing/transfer and job submission among DESY, IN2P3 and KEK over the VOs; on the firewall, Shigeo Yashiro will talk about this issue on Thursday.
- PPJ, the VO for accelerator science in Japan: installed and federated among 6 institutes, mostly non-production sites; a supporting system is being developed.
- Geant4 medical application VO: a web interface has been developed; job control and getting/viewing results are available with just a web browser.
- NAREGI beta 2 has been released; we are installing and testing it now.
- Resources and usage per VO:
  VO | # of cores | SPEC (kSI2K) | Storage (TB) | # of jobs | CPU (kSI2K*hrs)
  BELLE | 883 | 1,260 | 5.94/22.5 (26%) | 12,417 | 220,809
  ILC | 32,793 | 35,384 | 12.6/68.4 (18%) | 150,269 | 323,251
  CALICE | 13,469 | 15,140 | 15.6/204 (7.6%) | 145,776 | 338,531

36 Items to Be Continued
- "KEK new Grid system", given by Koichi Murakami on Wednesday:
  - CE and WN: migration to LSF, sharing resources between local users and grid users; most batch systems at KEK use LSF and most staff have a lot of know-how with it
  - SE integration: we are using SRM as the head node of the SE; access to the HSM via SRB-DSI is established, hopefully HPSS via HPSS-DSI soon; both can be reached via GridFTP, but not through the LFC
  - More robust and higher-performance services: using VMs, redundant design
- "Discussion on security issues on the GRID system and network operation", given by Shigeo Yashiro on Thursday:
  - Networking/security: we are planning to change the logical network configuration for more flexible operation

37 Acknowledgements
- Daily operation: all members of APROC (ASGC), all of the ROCs, K. Ishikawa and M. Matsui (ISE Co., Ltd)
- Belle virtual organization: K. Inami and M. Kaga (Nagoya Univ.), P. Lason (CYFRONET), J. Shih and M. Tsai (ASGC), M. Rosa and G. Moloney (Univ. of Melbourne), S. Lee (Korea University)
- ILC/CALICE virtual organization: R. Poeschl (LAL), A. Miyamoto (KEK), people in DESY-IT
- Accelerator science: M. Fujii and Y. Nagasaka (Hiroshima-IT), J. Ebihara and K. Yamada (Soum Co., Ltd), Y. Takeuchi (Univ. of Tsukuba), K. Kawagoe (Kobe University), T. Nagamine (Tohoku Univ.)
Many thanks for your attention.

