The Catania Grid Engine Mr. Riccardo Rotondo Consortium GARR, Rome, Italy

Presentation transcript:


Outline
Requirements
JSAGA
The Catania Grid Engine:
o Authentication
o Job Engine
o Data Engine
OGF35, Delft (NL), 18 Jun 2012

EGI Portal & Traceability Policies (1/2)
Portal Classes:

Portal Class     | Executable         | Parameters                              | Input
Simple one-click | provided by portal | provided by portal                      | provided by portal
Parameter        | provided by portal | chosen from enumerable and limited set  | chosen from repository vetted by the portal
Data processing  | provided by portal | chosen from enumerable and limited set  | provided by user
Job management   | provided by user   | provided by user                        | provided by user

Science Gateways

EGI Portal & Traceability Policies (2/2)
The Portal, the VO the Portal is associated to, and the Portal manager are all individually and collectively responsible and accountable for all interactions with the Grid
The Portal must be capable of limiting the job submission rate
The Portal must keep audit logs for all interactions with the Grid as defined in the Traceability and Logging Policy (minimum 90 days)
The Portal manager and operators must assist in security incident investigations
Where relevant, private keys associated with (proxy) certificates must not be transferred across a network, not even in encrypted form

A Simple API for Grid Applications (SAGA)
SAGA is made of:
SAGA Core Libraries: contain the SAGA base system, the runtime and the API packages (job management, data management, etc.);
SAGA Adaptors: provide access to the underlying grid infrastructure (adaptors are available for gLite, ARC, Globus, UNICORE and other middleware).
SAGA defines a standard; we then need an implementation!

JSAGA
JSAGA is a Java implementation of SAGA developed at CC-IN2P3;
JSAGA:
o Enables uniform data and job management across different grid infrastructures/middleware;
o Is easily extensible: adaptor interfaces are designed to minimize the coding effort needed to integrate support for new technologies/middleware;
o Is OS independent: most of the provided adaptors are written entirely in Java and are tested on both Windows and Linux.
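To make the uniform-API idea concrete, here is a minimal sketch of job submission through the standard SAGA Java API that JSAGA implements. The WMS endpoint, the proxy path and the "VOMS" context type are illustrative assumptions, not values taken from these slides.

```java
import org.ogf.saga.context.Context;
import org.ogf.saga.context.ContextFactory;
import org.ogf.saga.job.Job;
import org.ogf.saga.job.JobDescription;
import org.ogf.saga.job.JobFactory;
import org.ogf.saga.job.JobService;
import org.ogf.saga.session.Session;
import org.ogf.saga.session.SessionFactory;
import org.ogf.saga.url.URL;
import org.ogf.saga.url.URLFactory;

public class SubmitSketch {
    public static void main(String[] args) throws Exception {
        // Security context: the "VOMS" type and the proxy path are illustrative;
        // the context type actually depends on the adaptor being used.
        Session session = SessionFactory.createSession(false);
        Context context = ContextFactory.createContext("VOMS");
        context.setAttribute(Context.USERPROXY, "/tmp/x509up_u500");  // hypothetical proxy
        session.addContext(context);

        // Hypothetical gLite WMS endpoint; with another adaptor this could be
        // an ARC, Globus or UNICORE URL without changing the rest of the code.
        URL service = URLFactory.createURL("wms://wms.example.org:7443/glite_wms_wmproxy_server");
        JobService js = JobFactory.createJobService(session, service);

        // Describe and submit a trivial job.
        JobDescription jd = JobFactory.createJobDescription();
        jd.setAttribute(JobDescription.EXECUTABLE, "/bin/hostname");
        jd.setAttribute(JobDescription.OUTPUT, "stdout.txt");

        Job job = js.createJob(jd);
        job.run();
        System.out.println("Submitted job: " + job.getAttribute(Job.JOBID));
    }
}
```

Swapping middleware then means changing the service URL (and the security context), which is exactly the uniformity the Catania Grid Engine relies on.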

JSAGA Adaptors
JSAGA supports gLite, Globus, ARC, UNICORE, etc.

e-Token
In order to strongly reduce the risk of the robot certificate being compromised, the INFN CA decided to store this new certificate on board SafeNet eToken smart cards [6];
The eToken smart card can support many certificates;
A token PIN is prompted every time the user needs to interact with the smart card.

Host-based mutual authentication
[Diagram: users and client applications access the Grid Portals / Science Gateways, which talk to the e-Token Server using host-based mutual authentication.]

The eToken server working scenario
[Diagram: clients ask the eTokenServer for a service (list/create request, execute a service, get the results back) over SSL-encrypted connections; the eTokenServer asks the VOMS Server for VOMS AC attributes, stores long proxies on a MyProxy Server, and retrieves serials/proxies from it.]
OGF35, Delft (NL), 18 Jun 2012

User Tracking DB
GRID USAGE TRACEABILITY

Field             | Description
Common Name       | Portal user name as stored in LDAP
IP + Port         | IP address and TCP port used by the requester
Timestamp         | Identifies the grid operation date/time
Grid Interaction  | Grid interaction identification (job "X" submission, file upload/download). The portal MUST classify all the grid operations allowed. This value allows identifying both the applications used and the operations performed.
Grid ID           | Stores the actual Grid interaction ID (Job ID for job submission and some other relevant information for data transfer)
Robot Certificate | Identifies the robot certificate used for the grid operation

Two tables: one for active jobs and file transfers, and one for the finished ones.

Example of entry in the Users Tracking DB:
ID: 70
Common Name: fpistagna
IP + TCP Port: :8162
Timestamp: :16:29
Grid Interaction: 1
Grid ID: [wms://infn-wms-01.ct.pi2s2.it:7443/glite_wms_wmproxy_server]-[
Robot Certificate: /C=IT/O=INFN/OU=Robot/L=COMETA/CN=Robot: ViralGrid Science Gateway - Roberto Barbera
Virtual Organisation: cometa
OGF35, Delft (NL), 18 Jun 2012
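As an illustration of how a portal might record such an entry, here is a minimal JDBC sketch. The table name, column names, connection parameters and sample values are hypothetical, modelled only on the fields listed above.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Timestamp;

public class TrackingInsertSketch {
    // Hypothetical table and column names, modelled on the fields listed above.
    private static final String INSERT_SQL =
            "INSERT INTO ActiveGridInteractions "
          + "(common_name, ip_port, timestamp, grid_interaction, grid_id, robot_certificate) "
          + "VALUES (?, ?, ?, ?, ?, ?)";

    public static void main(String[] args) throws Exception {
        // Connection parameters and values are placeholders only.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:mysql://localhost:3306/userstracking", "tracker", "secret");
             PreparedStatement ps = conn.prepareStatement(INSERT_SQL)) {
            ps.setString(1, "jdoe");                                // Common Name from LDAP
            ps.setString(2, "192.0.2.10:8162");                     // requester IP + TCP port
            ps.setTimestamp(3, new Timestamp(System.currentTimeMillis()));
            ps.setInt(4, 1);                                        // grid interaction class
            ps.setString(5, "https://wms.example.org:9000/jobid");  // e.g. the job ID
            ps.setString(6, "/C=IT/O=EXAMPLE/CN=Robot: Example Gateway");
            ps.executeUpdate();
        }
    }
}
```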

The Grid Engine
[Architecture diagram: Science Gateways (Liferay portlets, Science GW 1-3) sit on top of the Grid Engine, which comprises the Science GW Interface, the Job Engine, the Data Engine, the Users Tracking & Monitoring module with its Users Tracking DB, and the SAGA/JSAGA API layer towards the grid middlewares, plus an eToken Server. Components are annotated as DONE or due by end of June.]

Job Engine Architecture
[Diagram: incoming jobs are placed in a Job Queue and picked up by worker threads (WT) for job submission towards the grid infrastructure(s); a second pool of worker threads checks job status and gets the output; both feed the User Tracking DB and the Monitoring Module.]
OGF35, Delft (NL), 18 Jun 2012
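A minimal sketch of the queue-plus-worker-threads pattern shown in the diagram; class and method names are illustrative and the actual Job Engine implementation may differ.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;

public class JobEngineSketch {

    /** A submission request as it could be produced by a portlet. */
    static class JobRequest {
        final String executable;
        JobRequest(String executable) { this.executable = executable; }
    }

    private final BlockingQueue<JobRequest> jobQueue = new LinkedBlockingQueue<>();
    private final ExecutorService submissionWorkers = Executors.newFixedThreadPool(8);

    /** Start the pool of submission worker threads that drain the job queue. */
    public void start() {
        for (int i = 0; i < 8; i++) {
            submissionWorkers.submit(() -> {
                while (!Thread.currentThread().isInterrupted()) {
                    try {
                        JobRequest req = jobQueue.take();  // blocks until a job is queued
                        submitToGrid(req);                 // e.g. through the SAGA/JSAGA API
                        // A separate pool of status-checking threads would poll the
                        // middleware, fetch outputs and update the tracking DB (not shown).
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            });
        }
    }

    /** Called by the Science Gateway interface to enqueue a new job. */
    public void enqueue(JobRequest req) {
        jobQueue.add(req);
    }

    private void submitToGrid(JobRequest req) {
        // Placeholder for the middleware-independent submission logic.
        System.out.println("Submitting " + req.executable);
    }
}
```

Decoupling submission from the user request through a queue is what lets the engine absorb bursts of parallel submissions, which the scalability slide below quantifies.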

Job Engine Features

Feature                | Description                                                                                                 | Status
Middleware Independent | Capacity to submit jobs to resources running different middleware                                          | DONE
Easiness               | Create code to run applications on the grid in a very short time                                           | DONE
Scalability            | Manage a huge number of parallel job submissions, fully exploiting the HW of the machine where the Job Engine is installed | DONE
Performance            | Have a good response time                                                                                   | DONE
Accounting & Auditing  | Register every grid operation performed by the users                                                       | DONE
Fault Tolerance        | Hide middleware failures from end users                                                                    | ALMOST DONE
Workflow               | Provide a way to easily create and run workflows                                                           | IN PROGRESS

OGF35, Delft (NL), 18 Jun 2012

Job Engine Scalability
Submission time scales linearly with the number of jobs: more than 10,000 jobs an hour; 40,000 jobs submitted in parallel!
[Plot: job submission time (h) versus number of jobs; time to submit 10,000 jobs shown in hours.]
OGF35, Delft (NL), 18 Jun 2012

Job Engine Interoperability
Interoperability is a property referring to the ability of diverse systems and organizations to work together (inter-operate). The term is often used in a technical systems engineering sense, or alternatively in a broad sense, taking into account social, political, and organizational factors that impact system-to-system performance;
According to ISO/IEC (Information Technology Vocabulary, Fundamental Terms), interoperability is "The capability to communicate, execute programs, or transfer data among various functional units in a manner that requires the user to have little or no knowledge of the unique characteristics of those units".
OGF35, Delft (NL), 18 Jun 2012

Job Engine Interoperability
gLite-based e-Infrastructures/Projects:
EUAsiaGrid
EUChinaGRID
EU-IndiaGrid
EUMEDGRID
GISELA
IGI (Italy)
SAGrid (South Africa)

MyJobsMap (1/3)

MyJobsMap (2/3)

MyJobsMap (3/3)
Both sequential and MPI-enabled jobs successfully executed
The CHAIN project is preparing a demo of worldwide interoperability among gLite, UNICORE, OurGrid, GOS, and GARUDA, to be presented at the next EGI Technical Forum

Usage Workflow
[Diagram: 1. Sign in; 2. Grid request; 3. Proxy request to the eTokenServer; 4. Proxy transfer; 5. Grid interactions, with tracking in the User Tracking DB; 6. Getting results.]
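Steps 3-4 boil down to the gateway fetching a short-lived robot proxy from the eTokenServer. The sketch below shows one plausible way to do that over HTTPS; the host, path, query parameter and identifier are pure assumptions, since the slides do not show the eTokenServer's actual interface.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class ProxyRequestSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint: not the documented eTokenServer interface.
        String robotId = "ROBOT-CERT-ID";  // placeholder identifier of the robot certificate
        URL url = new URL("https://etokenserver.example.org:8443/eTokenServer/eToken/"
                + robotId + "?voms=cometa");

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        StringBuilder proxy = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                proxy.append(line).append('\n');
            }
        }
        // The PEM-encoded proxy obtained here (step 4) would then be used by the
        // Grid Engine for the grid interactions performed on behalf of the user.
        System.out.println(proxy);
    }
}
```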

Data Engine Requirements
Grid storage complexity hidden from end users:
o Users move files from/to a portal and see it as simple external storage accessible from a web interface, without caring about grid (or any other) technologies behind it
File management smoothly integrated with all the services provided in the SG
The underlying architecture exposes a file-system-like view (i.e., a Virtual File System or VFS) through which users can perform the following actions:
o Create, move, delete files/directories with the desired structure
o Share files with other users
o Set the number of backup copies desired
EGICF 2012, Munich
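Purely as an illustration of that file-system-like view, here is a small interface sketch covering the operations just listed; the method names and signatures are assumptions, not the actual Data Engine API.

```java
import java.io.InputStream;
import java.util.List;

/**
 * Illustrative sketch of the file-system-like (VFS) view described above.
 * Method names and signatures are assumptions, not the actual Data Engine API.
 */
public interface VirtualFileSystem {

    // Create, move, delete files/directories with the desired structure.
    void createDirectory(String vfsPath);
    void move(String sourcePath, String destinationPath);
    void delete(String vfsPath);

    // Move files from/to the portal as if it were simple external storage.
    void upload(String vfsPath, InputStream content);
    InputStream download(String vfsPath);

    // Share files with other users.
    void share(String vfsPath, List<String> userNames);

    // Set the number of backup copies desired.
    void setBackupCopies(String vfsPath, int copies);
}
```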

Data On Grid Services: DOGS
A file browser shows Grid files in a tree
The file system exposed by the SG is virtual
Easy transfer from/to the Grid (by the SG) is done in a few clicks
Users do not need to care about how and where their files are really located
EGICF 2012, Munich

Upload workflow
[Diagram: 1. Sign in; 2. Upload request; 3. Proxy request to the eTokenServer; 4. Proxy transfer; 5. File upload; 6. Upload on Grid; 7. Update of the DOGS DB and tracking in the User Tracking DB.]
EGICF 2012, Munich

Back-end technical details
JSAGA API used to transfer data from/to storage elements
Hibernate to manage the VFS collecting information on files stored on the Grid; any changes/actions in the user view affect the VFS
MySQL as the underlying RDBMS
An additional component has been developed in order to keep track of each transaction in the users tracking DB (to be compliant with the EGI Portal and User Traceability Policies)
EGICF 2012, Munich
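For the first point, a minimal sketch of a file transfer through the SAGA Java API (as provided by JSAGA) could look like the following; the local path and SRM endpoint are placeholders, and the security-context setup is omitted.

```java
import org.ogf.saga.file.File;
import org.ogf.saga.file.FileFactory;
import org.ogf.saga.namespace.Flags;
import org.ogf.saga.session.Session;
import org.ogf.saga.session.SessionFactory;
import org.ogf.saga.url.URL;
import org.ogf.saga.url.URLFactory;

public class DataTransferSketch {
    public static void main(String[] args) throws Exception {
        // Security-context setup (VOMS proxy) omitted; see the job submission
        // sketch earlier. Local path and SRM endpoint below are placeholders.
        Session session = SessionFactory.createSession(true);

        URL local  = URLFactory.createURL("file:///tmp/input.dat");
        URL remote = URLFactory.createURL("srm://se.example.org:8446/dpm/example.org/home/vo/input.dat");

        // Copy the local file to the storage element, overwriting any existing copy.
        File file = FileFactory.createFile(session, local, Flags.READ.getValue());
        file.copy(remote, Flags.OVERWRITE.getValue());
        file.close();

        // The Data Engine would then update the Hibernate-managed VFS entry and
        // log the transfer in the users tracking DB (not shown here).
    }
}
```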

Front-end technical details
A portlet has been created to be deployed in a Liferay-based portal, to which access is provided only to federated users with given roles and privileges
o The portlet view component includes elFinder, a web-based file manager developed in JavaScript using jQuery UI for a dynamic and user-friendly interface
EGICF 2012, Munich

The Data Engine in action
EGICF 2012, Munich

The Data Engine in action
«Share» to be added soon
EGICF 2012, Munich

The Data Engine in action
EGICF 2012, Munich

The Data Engine in action
EGICF 2012, Munich

Thank you for your kind attention
OGF35, Delft (NL), 18 Jun 2012