06/08/10 PBS, LSF and ARC integration Zoltán Farkas MTA SZTAKI LPDS.

Outline
- Introduction
- Requirements
- PBS and LSF
- ARC
- Architecture of the P-GRADE Portal runtime layer
- PBS/LSF integration
- ARC integration
- Summary

Introduction
- P-GRADE Portal supported gLite and Globus.
- ETHZ requirements:
  - make use of PBS local clusters,
  - make use of LSF local clusters (Brutus),
  - sometimes make use of ARC grid resources.
- All of this should be integrated within P-GRADE Portal.

PBS (and LSF)
- PBS: Portable Batch System; LSF: Load Sharing Facility.
- Both schedule users' jobs on a cluster.
- Users log in interactively to a submission node and run commands:
  - qsub (bsub): submit a job
  - qstat (bjobs): query job status
  - qdel (bkill): abort a job
- [Diagram: a submission node and a scheduler node in front of several cluster nodes]
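The PBS/LSF command pairs above can be captured in a small shell helper that maps a generic action to the scheduler-specific command name. This is a minimal sketch for illustration; the function name and interface are not part of P-GRADE Portal.

```shell
# Map a generic batch action to the PBS or LSF command name.
# usage: batch_cmd <pbs|lsf> <submit|status|abort>
batch_cmd() {
  case "$1:$2" in
    pbs:submit) echo qsub  ;;
    pbs:status) echo qstat ;;
    pbs:abort)  echo qdel  ;;
    lsf:submit) echo bsub  ;;
    lsf:status) echo bjobs ;;
    lsf:abort)  echo bkill ;;
    *) return 1 ;;  # unknown scheduler or action
  esac
}
```

For example, `"$(batch_cmd lsf submit)" myjob.sh` would run `bsub myjob.sh` on an LSF submission node.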

ARC
- Advanced Resource Connector: a complete grid middleware with
  - an information system,
  - command-line clients with an integrated broker,
  - a data management stack (GridFTP).
- Usable through client programs:
  - job description language: xRSL
  - ngsub: submit
  - ngstat: query status
  - ngkill: cancel
  - ngget: fetch results
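A minimal xRSL job description and the matching client calls might look as follows. The job attributes (file names, job name) are illustrative placeholders, not values from this deck; real submissions need site-specific inputs.

```shell
# Write a minimal xRSL job description (attribute values are illustrative).
cat > job.xrsl <<'EOF'
&(executable="myapp.sh")
 (arguments="input.dat")
 (inputFiles=("input.dat" ""))
 (outputFiles=("result.dat" ""))
 (stdout="stdout.txt")
 (jobName="demo")
EOF

# Typical client workflow (requires an ARC client installation and a valid proxy):
# ngsub -f job.xrsl    # submit; prints the job ID
# ngstat <jobid>       # poll the job status
# ngget <jobid>        # fetch results once finished
```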

P-GRADE Portal Architecture
- Workflow Editor-related components
- Portlet-related components
- Workflow data storage
- Execution layer (see next slide)

[Diagram: on the P-GRADE Portal machine, an Apache Tomcat servlet container hosts the GridSphere portal framework with the P-GRADE Portal portlets and the Workflow Editor servlet (talking to the Workflow Editor client). Common workflow and job execution scripts, driven by DAGMan, dispatch jobs through Globus, EGEE and PBS scripts to Globus grids, EGEE grids and PBS clusters. User workflow data is stored on the portal's filesystem.]

LSF and PBS integration I.
- Principal idea: the user configures a remote SSH connection to a submission node through the Settings portlet.
- Connections are established using SSH keypairs.
- Established connections are reused in order to minimize SSH connection attempts.
- Connections are used on a per-user, per-resource basis:
  - a given user's connection is not accessible to other users,
  - different resources use different connections.
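The reuse of established SSH connections can be sketched with OpenSSH connection multiplexing: one control socket per (user, host) pair approximates the per-user, per-resource pooling described above. The host name, user name and key path below are assumptions for illustration, not the portal's actual configuration.

```shell
# Per-resource SSH config enabling connection reuse via multiplexing.
# "submit.example.org", "portaluser" and the key path are placeholders.
cat > ssh_pool_config <<'EOF'
Host submit.example.org
    User portaluser
    IdentityFile ~/.ssh/id_portal
    ControlMaster auto
    ControlPath ~/.ssh/pool-%r@%h:%p
    ControlPersist 10m
EOF

# Each invocation reuses the master connection if one is alive:
# ssh -F ssh_pool_config submit.example.org qstat
```

The `%r@%h:%p` tokens in `ControlPath` make the socket unique per remote user, host and port, so different users and resources never share a connection.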

[Diagram: the portal machine holds one connection pool per user; each pool contains a separate keypair-authenticated connection (private/public key) per resource, e.g. LSF resources 1–3 and PBS resources 1–2. Pools of different users are isolated from each other.]

LSF and PBS integration III.
- Job preparation:
  - wkf_pre_LSF.sh: prepare the job and wrapper, collect files
  - wkf_pre_PBS.sh: prepare the job and wrapper, collect files
- Job execution:
  - wkf_LSF.sh: submit and observe the job using the b* commands
  - wkf_PBS.sh: submit and observe the job using the q* commands
- Wrappers:
  - LSF_fake.sh: handle generator and collector jobs, run the executable
  - PBS_fake.sh: handle generator and collector jobs, run the executable
- Job post-processing:
  - no real task (wkf_post_LSF.sh and wkf_post_PBS.sh)
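The submit-and-observe pattern behind the wkf_LSF.sh / wkf_PBS.sh scripts boils down to a polling loop. The sketch below is a generic, illustrative version: the status command, the "DONE" state string and the polling interval are assumptions, not the scripts' actual interface.

```shell
# Poll a job status command until it reports "DONE".
# usage: wait_for_done <status-cmd> [interval-seconds]
# <status-cmd> stands in for e.g. a qstat/bjobs invocation whose output
# has been reduced to a single state word.
wait_for_done() {
  local interval="${2:-5}"
  local state
  while state="$($1)"; do          # loop while the status command succeeds
    [ "$state" = "DONE" ] && return 0
    sleep "$interval"
  done
  return 1                          # status command failed: job lost/aborted
}
```

A real wrapper would map the scheduler's state codes (e.g. `C` in qstat, `DONE`/`EXIT` in bjobs) onto such a normalized state word before polling.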

LSF and PBS integration features
- Full parameter study (PS) support
- Very quick response times compared to grid middleware
- Support for any kind of executable

ARC integration I.
- Very similar to the EGEE support.
- An ARC client stack has to be installed on the P-GRADE Portal machine.
- Users gain access with X.509 proxy certificates.
- Two possible modes of resource selection:
  - the user specifies the target cluster, or
  - the cluster is selected by the client-side broker.

ARC integration II.
- Job preparation: wkf_pre_nordugrid.sh
  - wrapper script preparation,
  - generator-related cleanups (as needed),
  - autogenerator-related file uploads (as needed).
- Job execution: wkf_nordugrid.sh
  - xRSL prepared based on job properties,
  - job submission and management using the ng* commands,
  - wrapper script: manages generator and collector jobs if needed.
- Job post-processing: wkf_post_nordugrid.sh
  - no real work to perform.

ARC integration features
- Full parameter study (PS) support
- Possibility to select the execution resource
- Support for any kind of executable
- Multi-node job support
- Possibility to specify runTimeEnvironment attributes
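The multi-node and runtime-environment features map onto xRSL attributes. The sketch below is illustrative only: the executable name, node count and runtime environment string are invented placeholders, not values used by the portal.

```shell
# xRSL sketch for a multi-node job requesting a runtime environment.
# "run_mpi.sh", count=8 and "ENV/MPI" are illustrative placeholders.
cat > mpi_job.xrsl <<'EOF'
&(executable="run_mpi.sh")
 (count=8)
 (runTimeEnvironment="ENV/MPI")
 (stdout="out.txt")
EOF

# Submitted like any other ARC job:
# ngsub -f mpi_job.xrsl
```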

Summary
- PBS, LSF and ARC integration was relatively simple thanks to the pluggable architecture of P-GRADE Portal.
- However, the devil is in the details:
  - SSH connection sharing vs. parallel connection limits,
  - proper LSF job cancellation,
  - …