INFSO-RI-031688 Enabling Grids for E-sciencE

Charon Extension Layer

Petr Kulhánek (1,2), Martin Petřek (1,2), Jan Kmuníček (1,3)

1) CESNET z. s. p. o., Zikova 4, CZ Praha, Czech Republic
2) National Centre for Biomolecular Research, Faculty of Science, Masaryk University, Kotlářská 2, CZ Brno, Czech Republic
3) Institute of Computer Science, Masaryk University, Botanická 68a, CZ Brno, Czech Republic

EGEE and SEEGRID-2 Summer School, Budapest, Hungary, 3-8 July

Contents

Introduction
Charon Infrastructure
Application Management
– Module System
– Applications in GRID Environment
Job Management
– Charon System
– Sites
Conclusions
Acknowledgements

Introduction

What is Charon?
– a uniform and modular approach to the submission and management of (complex) computational jobs
– a generic system for running application programs in the Grid environment (LCG/gLite middleware, …)

Why Charon?
– many different batch systems and scheduling components are used in grid environments
– each batch system has unique tools and a different philosophy of use
– the tools provided by LCG/gLite are quite raw and simple
– many additional tasks are needed to use computing resources properly

CHARON Infrastructure

Application management
– single/parallel execution without job script modification

Job management
– easy job submission, monitoring, and result retrieval

Command Line Interface (CLI) approach

Application Management

Requirements:
– easy application initialization
– version conflict handling
– handling of inter-application conflicts and/or dependencies
– same usage for single and parallel execution
– different levels of parallelization

⇒ Module System

Module System

– similar approach to the Environment Modules Project*
  • applications are activated by modifications of the shell environment (e.g. PATH, LD_LIBRARY_PATH, etc.)
– a particular build of an application is described by a realization (e.g. by instructions that describe the shell environment modifications)
– a realization is identified by a name consisting of four parts:
  name[:version[:architecture[:parallelmode]]]
– the user may specify only part of a realization; in that case, the Module System completes the full name of the realization so that the application best fits the available computational resources

*)

Module System (continued)

Commands of the Module System:

module [action] [module1 [module2] …]
– the main command of the Module System
– actions:
  • add (load), remove (unload)
  • avail, list*, active, exported, versions, realizations
  • disp, isactive
  (* list is the default action)

modconfig
– menu-driven configuration of the Module System (visualizations, autorestored modules, etc.)
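A short session sketch of how these commands are typically combined (module names and the listed output are illustrative, not actual repository contents):

  $ module avail              # list realizations available on this site
  $ module add povray:3.6     # activate a particular version
  $ module                    # 'list' is the default action: show active modules
  $ module remove povray      # deactivate it again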

Module System (continued)

Example of Module Activation:

  $ module add povray

  Module specification: povray (add action)
  ==============================================================
  WARNING: Nonoptimal architecture is used for module 'povray'

  Cache type        : system cache
  Architecture      : i786
  Number of CPUs    : 1
  Max CPUs per node : 1

  Exported module   : povray:3.6
  Complete module   : povray:3.6:i386:single

Module Name Completion:
  povray → povray:3.6:auto:auto → povray:3.6:i386:single
  (user input → default values → resolved final name)

Module System (continued)

Module Name Completion:
– name – specified by the user (mandatory)
– version – specified by the user / default
– architecture – specified by the user / default / automatically determined
  • the Module System tries to find the realization closest to the system architecture
– parallelmode – specified by the user / default / automatically determined
  • para – always matches
  • p4 – NCPU > MaxCPUs/node
  • shmem – 1 < NCPU <= MaxCPUs/node
  • node – NCPU <= MaxCPUs/node
  • single – NCPU = 1
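The auto-detection rule for parallelmode can be summarized by a small sketch (a simplified illustration only; function and variable names are made up, and real realizations tagged 'para' match any CPU count):

  # Simplified sketch of parallel-mode auto-detection (illustrative only).
  select_parallel_mode() {
      local ncpu=$1 max_per_node=$2
      if [ "$ncpu" -eq 1 ]; then
          echo single                     # one CPU
      elif [ "$ncpu" -le "$max_per_node" ]; then
          echo shmem                      # fits on one node ('node' also qualifies)
      else
          echo p4                         # spans several nodes
      fi
  }

  select_parallel_mode 4 2    # prints: p4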

Applications in GRID

Model I – METACentrum (Czech national GRID):
– applications are on a shared volume available to all grid elements

Model II – EGEE GRID:
– applications cannot be shared with all grid elements
– their “sharing” is provided by deploying them to an SE (done once)
– only the required applications are then installed on the CE during each job execution

Legend: UI – user interface, CE – computing element, SE – storage element, WN – worker node, app – application

Applications in GRID (continued)

Advantages:
– all applications are available on the whole grid immediately after their deployment to the SE
Drawbacks:
– this approach is suitable only for medium- and long-term jobs

– both modes can be carried out without any modification of the Module System
– desired “non-standard” functions can be carried out by user- (administrator-) defined hooks, so-called modactions
– a modaction is a script that is executed during any action of the module command
– the modaction script for the add action solves the problem of applications in Model II (see the sketch below):
  • it behaves differently on the UI and on the WN
  • on the UI, it activates applications from the shared volume
  • on the WN (CE), it downloads the package from the SE and installs it to a temporary volume; the Module System then sets the environment so that the application is used from that volume
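A minimal sketch of what such a modaction hook for the add action might look like in Model II. Everything here is an illustrative assumption, not the actual Charon implementation: the WN detection via GLITE_WMS_JOBID, the LFN path, and the CHARON_APP_ROOT variable are all made up for the example; lcg-cp is the standard LCG data-movement command.

  #!/bin/bash
  # Hypothetical modaction 'add' hook (illustrative only).
  # $1 ... full realization name, e.g. povray:3.6:i386:single
  REALIZATION="$1"

  if [ -n "$GLITE_WMS_JOBID" ]; then
      # On a worker node: fetch the package from the SE and
      # install it into a temporary volume.
      TMP=$(mktemp -d)
      lcg-cp "lfn:/grid/voce/apps/${REALIZATION}.tar.gz" "file:${TMP}/app.tar.gz"
      tar -xzf "${TMP}/app.tar.gz" -C "${TMP}"
      export CHARON_APP_ROOT="${TMP}"    # Module System then points PATH etc. here
  else
      # On the UI: use the application from the shared volume.
      export CHARON_APP_ROOT="/software/${REALIZATION}"
  fi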

Job Management

Requirements:
– easy job submission
– the user should focus only on the desired task, not on everything related to submission
– easy parallel execution of applications
– frequently repeated tasks should be processed automatically
– keep information about the job during and/or after execution

⇒ Charon System

Charon System

Overview:
– it is an application in the context of the Module System
– it separates resource settings from job submission

Job Submission and Management:
– psubmit <resources/alias> <job_script> [NCPU] [syncmode]
– pinfo
– psync
– pkill
– pgo (does not work in the EGEE GRID environment)

Charon Setup:
– pconfigure

Charon System (continued)

Typical job flow (diagram): submit with psubmit, monitor with pinfo, retrieve results with psync.

After submission, no additional arguments are required – all information about the job is stored in control files in the job directory.
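A hypothetical session illustrating this flow (the alias 'egee_short' and the script name are made up for the example; the exact psubmit arguments may differ):

  $ cd myjob/                  # each job lives in its own directory
  $ psubmit egee_short job.sh  # submit job.sh to the configured resource
  $ pinfo                      # check status; reads the control files, no arguments needed
  $ psync                      # retrieve the results once the job has finished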

Charon System (continued)

Job Restrictions:
– a job is described by a script* (an example sketch follows below)
– each job has to be in a separate directory – control files need to be unique
– job directories must not overlap – the job directory is copied to the WN and then back
– only paths relative to the job directory may be used in the job script – only data from the job directory will be present on the WN
– software should be activated via the Module System – only then can the best utilization of resources be reached

Job autodetection*:
– in some cases, the user can specify an input file instead of a script, and the Charon System will prepare a script for its processing
– currently autodetected job types are: gaussian, povray, and precycle
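A minimal job script sketch respecting these restrictions (the input and output file names are illustrative):

  #!/bin/bash
  # job.sh – everything is referenced relative to the job directory

  module add povray                        # activate software via the Module System

  # scene.pov sits in the job directory; the rendered image lands there too
  povray +Iscene.pov +Oscene.png +W800 +H600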

Charon System (continued)

Configuration:
– Sync Mode – option for data transfer between UI and WN
  • gridcopy: all data within the job directory as input, all data within the job directory as result
  • stdout: all data within the job directory as input, only standard output as result (other data are discarded)
– Resources – identification of a particular CE
– Properties – fine-grained selection of computational resources (through the Requirements item in the JDL)
– Alias – uniform combination of the above setup items in a single word

The pconfigure command, which is menu driven, serves for configuration.
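For illustration, once an alias has been defined through pconfigure, the sync mode could be picked per submission like this (the alias and script names are hypothetical):

  $ psubmit egee_short render.sh 1 gridcopy   # the whole job directory comes back
  $ psubmit egee_short render.sh 1 stdout     # only standard output comes back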

Sites

– sites represent an approach to utilizing different grids (sites) from one computer
– sites are special modules within the Module System
– all sites share the same software repository, but the list of available applications depends on the site's Module System setup
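Since sites are ordinary modules, switching between grids might look like this hypothetical session (the site names are illustrative):

  $ module add metacentrum   # work with the Czech national grid
  $ module add egee          # ...or switch to the EGEE grid
  $ module avail             # the visible applications now depend on the active site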

Conclusions

Single job management:
– encapsulation of a single computational job
– minimization of the overhead resulting from direct middleware usage (JDL file preparation, etc.)
– easy submission and navigation during the job lifetime

Application management:
– powerful software management and administration
– comfortable enlargement of the available application portfolio

Acknowledgements

– Luděk Matyska (CESNET, ICS)
– Jaroslav Koča (NCBR)
– European Commission
  • EGEE-II (contract number RI-031688)
  • EGEE (contract number IST)
– Ministry of Education, Youth, and Physical Training of the Czech Republic (contract number MSM)
– Grant Agency of the Czech Republic (204/03/H016)