
National Institute of Advanced Industrial Science and Technology
Developing Scientific Applications Using Standard Grid Middleware
Hiroshi Takemiya
Grid Technology Research Center, National Institute of Advanced Industrial Science and Technology

Goal of the Research
Examining and evaluating the effectiveness of grid middleware by "gridifying" real applications.
The computational grid has become feasible for running applications: many kinds of middleware have been provided (Globus Toolkit, UNICORE, MPICH-G, Ninf-G, ...), and several pioneering works have succeeded in running applications on it (climate simulation, molecular simulation, fluid simulation, virtual screening, astronomical virtual observatory, ...). There is, however, little information on how to gridify legacy applications:
- How easily can an application be gridified?
- How efficiently can the application be executed on the grid?
This information will be valuable for both application programmers and middleware providers.

Weather Forecasting System
Predicts short- to middle-term weather change. Based on the barotropic S-model proposed by Dr. Tanaka:
- a legacy FORTRAN program, simple and precise
- treats vertically averaged quantities and solves the shallow water equations (a 2-D simulation)
- 150 sec per simulation for a 100-day prediction
The system succeeded in reproducing characteristic phenomena: the distribution of jet streams and the blocking of high atmospheric pressure (1989/1/30-2/12).
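The slides do not reproduce the model equations; for reference, the standard form of the shallow water equations that such a barotropic 2-D model solves is (the S-model's exact formulation, grid, and forcing terms may differ):

```latex
\begin{aligned}
\frac{\partial u}{\partial t} + u\frac{\partial u}{\partial x} + v\frac{\partial u}{\partial y} - fv &= -g\frac{\partial h}{\partial x},\\
\frac{\partial v}{\partial t} + u\frac{\partial v}{\partial x} + v\frac{\partial v}{\partial y} + fu &= -g\frac{\partial h}{\partial y},\\
\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} + \frac{\partial (hv)}{\partial y} &= 0,
\end{aligned}
```

where u and v are the vertically averaged velocity components, h is the fluid depth, f the Coriolis parameter, and g the gravitational acceleration.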

Ensemble Simulation
To keep high precision over a long period, a statistical ensemble mean is taken, introducing a different perturbation for each simulation. This requires 100-1000 sample simulations: each sample perturbs the observational data and evolves it in time with the leap-frog method, and the samples are combined into a statistical result. Gridifying the program enables quick response.

Ninf-G Library at a Glance
Assigns independent tasks to distributed resources.
- Reference implementation of GridRPC, standardized at the GGF GridRPC WG
- Provides RPC semantics on the grid; suited for gridifying task-parallel programs; asynchronous RPC calls are provided as well
- Built on top of the Globus Toolkit:
  GRAM: invocation of server programs
  Globus I/O, GASS: communication between client and server
  MDS: managing stub information
  GSI: authentication between client and server
(Diagram: on the server side, the IDL compiler generates the remote executable and an LDIF file of interface information from the IDL file, and the interface information is registered with MDS. The client 1. sends an interface request, 2. receives the interface reply retrieved from MDS, 3. invokes the remote executable through GRAM (fork), and 4. the executable connects back to the client over GSI, Globus I/O, and GASS.)

Advantages of Using the Ninf-G Library (1)
Grid applications can be constructed easily:
- the complicated mechanisms of the grid are hidden
- the programming model is based on familiar RPC semantics
- support tools are provided: ns_client_gen (client program compiler) and ns_gen (stub generator)
The client program simply issues asynchronous calls and waits for their completion:

  main() {
      for (i = 0; i < task_no; i++)
          grpc_call_async(dest[i], "remote_call", args);
      grpc_wait_all(dest);
  }

while the server program is an ordinary function wrapped in a generated stub:

  remote_call() {
      Processing();
      return;
  }

No detailed knowledge of the grid infrastructure is required.

Advantages of Using the Ninf-G Library (2)
Write once, run everywhere:
- based on a standard API; many libraries (NetSolve, DIET, OmniRPC, ...) provide the same API (cf. MPI vs. the NX library)
- constructed on the most popular grid middleware
Robust and flexible applications can be constructed:
- dynamic server program invocation
- recovery from failures in establishing connections and allocating tasks

Gridifying the Weather Forecasting System
Two parts of the program were gridified: the simulation part and the visualization part, executed in a pipelined fashion and accessed through the GridLib portal. The S-model program reads the data, solves the equations on remote servers via Ninf-G, averages the results, and visualizes them, again via Ninf-G.

Behavior of the System
(Diagram: the user accesses the system through the GridLib portal; the client starts the simulation and visualization servers with globus-job-run, transfers files with gass-url-copy, and issues grpc_call requests to the simulation server and the visualization server.)

Current Status of the System
Deployed on the ApGrid testbed: the client program runs on a cluster at AIST, and the server programs run on 10 clusters at 9 institutes. Demonstrations succeeded at the CCGrid conference (using 110 CPUs) and at the PRAGMA workshop (using 183 CPUs).
ApGrid testbed sites (all running the weather simulation server program unless noted):
- AIST, Japan: 64 CPUs (also hosting the visualization program), plus 10 CPUs running the weather simulation client program and GridLib
- TITECH, Japan: 16 CPUs (two clusters)
- Tsukuba U, Japan: 16 CPUs
- Osaka U, Japan: 156 CPUs
- Doshisha U, Japan: 16 CPUs
- KU, Thailand: 15 CPUs
- NCHC, Taiwan: 16 CPUs
- HKU, Hong Kong: 32 CPUs
- KISTI, Korea: 64 CPUs

Lessons Learned (1)
How easily could the application be gridified? Gridifying the application using Ninf-G was very easy: 13 days of work and about 300 lines of modification. Most of the modification could be performed on a single computer, replacing a local call with a remote one is straightforward, and creating the server program is automated. The steps were:
1. Specifying the interface of the server function.
2. Eliminating data dependences between client and server.
3. Inserting Ninf-G functions into the client program.
4. Creating the server stubs.
Of the 13 days, 10 were spent working on a single computer and 3 on the grid environment.
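Step 1, specifying the interface of the server function, is done in Ninf-G's IDL. A rough sketch for a hypothetical forecast routine might look like this; the module name, function name, array sizes, and object file are all invented, and the exact syntax should be checked against the Ninf-G documentation:

```
Module weather;
Define forecast(IN int ndays, IN double init[1000], OUT double result[1000])
Required "smodel.o"
Calls "Fortran" forecast(ndays, init, result);
```

From such a definition, ns_gen produces the server stub and the LDIF interface information automatically.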

Lessons Learned (2)
How was the performance of the application? Attaining good performance is difficult: initialization and termination costs are large, and modification of the application, as well as of the middleware, is needed. The main causes were:
- the large look-up cost of MDS
- the long polling period of GRAM
- blocking in the Ninf-G initialization/termination functions
(Chart: elapsed time in seconds for initialization and termination on the AIST, KISTI, and DU clusters.)

Performance Result of Executing 200 Sample Simulations
An optimized program was executed:
- bypassing the MDS lookup
- modifying the GRAM source to reduce the polling period
- multi-threading the application to avoid blocking
(Chart: elapsed time in seconds on the AIST, KISTI, and DU clusters.)

Future Work
Middleware level:
- improving Ninf-G based on the knowledge gained through this work; Ninf-G2 will be released in Nov.
- designing a task-farming API on top of Ninf-G: it is difficult to implement efficient, robust, flexible applications by hand, so the scheduling, error recovery, and multi-threading mechanisms should be hidden
Application level:
- checking the scalability of the middleware and applications, using more than 500 CPUs
