SZTAKI Desktop Grid. P. Kacsuk, MTA SZTAKI. www.lpds.sztaki.hu


SZTAKI Desktop Grid P. Kacsuk MTA SZTAKI

Desktop Grid model
[Diagram: a company/university server distributes work packages over the Internet/Intranet to dynamically donated resources; each donor is a company, university, or private PC running the application.]

Characteristics of the desktop Grid model
- Anybody can donate resources
- Heterogeneous resources that dynamically join and leave
- One or a small number of projects can use the resources
- Asymmetric relationship between donors and users: U << D
Advantages:
- Donating a PC is extremely easy
- Setting up and maintaining a DG server is much easier than installing the server software of utility grids
Disadvantages:
- Dynamic job submission is not supported
- The supported application is static (typically a very long-running application)

Master/slave parallelism and parameter studies
[Diagram: the DG server (master) distributes Workunit-1, Workunit-2, Workunit-3, ... Workunit-N over the Internet to donor PCs.]
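The master/worker decomposition on this slide can be sketched in a few lines. This is an illustrative sketch only; the function names (make_workunits, worker, master) are hypothetical and are not the DC-API, and the squaring stands in for a real parameter-study computation.

```python
# Illustrative master/worker sketch: the master splits a parameter study
# into independent workunits; each workunit can run on any donor PC.

def make_workunits(parameters, chunk_size):
    """Split a parameter study into independent workunits."""
    return [parameters[i:i + chunk_size]
            for i in range(0, len(parameters), chunk_size)]

def worker(workunit):
    """A donor PC processes one workunit independently of all others."""
    return [p * p for p in workunit]  # stand-in for the real computation

def master(parameters, chunk_size=3):
    """The DG server: hand out workunits, then merge the results."""
    results = []
    for wu in make_workunits(parameters, chunk_size):
        results.extend(worker(wu))    # in a real DG this runs on a donor PC
    return results
```

Because the workunits share no state, the server can hand them to donors in any order and merge results as they arrive.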

Types of Desktop Grids
- Global Desktop Grid: the aim is to collect resources world-wide for grand-challenge scientific problems. Example: SZTAKI Desktop Grid, global version, at:
- Local Desktop Grid: the aim is to enable the quick and easy creation of a grid for any community (company, university, city, etc.) to solve its own applications. Example: SZTAKI Desktop Grid, local version

Global Desktop Grids
- clients join the server voluntarily
- public resources and a public project
- resources can be attached to more than one Desktop Grid project
- resources are mostly behind firewalls or NAT
- resources come and go, giving fluctuating performance

Local Desktop Grids
- private (dedicated) resources of an institute or department, private project(s)
- more freedom in application choice and administration
- oriented towards companies and institutes
- more predictable performance

SZTAKI Desktop Grid: global version
- Main objective: demonstrate the usage of DG systems for any community
- Supported application: searching for 12-dimensional binary number systems
- Number of registered donors: >19000
- Number of registered PCs: >
- How to register a PC?

SZTAKI Desktop Grid global version

SZTAKI Desktop Grid performance. TOP 500 entry performance: 1645 GFlops

SZTAKI Desktop Grid donors (users)

SZTAKI Desktop Grid local version
- Main objectives: enable the creation of a local DG for any community, and demonstrate how to create such a system
- Building production service Grids requires huge effort and is a privilege of organizations where high Grid expertise is available
- Using the local SZDG package, any organization can build a local DG in a day, with minimal effort and minimal cost
- The applications of the local community are executed by the spare PC cycles of the local community
- There is no limitation on the PCs that can be used; all the PCs of the organization can be exploited (heterogeneous Grid)

Usage of SZTAKI Desktop Grid

DSP application on a local SZDG at the University of Westminster
Digital Signal Processing application: designing optimal periodic nonuniform sampling sequences. Currently more than 100 PCs are connected from Westminster, with plans to extend to over 1000 PCs. The speedup:

DSP size    Sequential    Production    SZDG
            ~3h 33min     ~35min        ~1h 44min
            ~41h 53min    ~7h 23min     ~5h 4min
            ~724h         ~141h         ~46h 46min

Usage of local SZDG in industry
- Comgenex Ltd. -> AMRI Hungary: drug discovery application; creating an enterprise Grid for prediction of ADME/Tox parameters; millions of molecules to test against potential drug criteria
- Hungarian Telecom: creating an enterprise Grid to support large data mining applications where single-computer performance is not enough
- OMSZ (Hungarian Meteorology Service): creating an enterprise Grid for climate modeling

Components of SZDG
- Normal Desktop Grid (the current SZDG, available as a package)
- Desktop Grid with cluster extension (available as a package): the goal is to include clusters in an SZDG (EU CancerGrid project)
- Hierarchical Desktop Grid (available as a prototype): the goal is to build larger DGs from smaller ones in a hierarchical way, e.g. an enterprise DG can be built by exploiting the PCs of the department DGs; expected release by October 2007 (Hungarian DG project)

Normal Desktop Grid
[Diagram: separate local DGs for a university department, a university faculty, and an enterprise.]
Each local DG runs the applications of the local community (university department, faculty, enterprise, etc.)

Mixed Desktop Grid
[Diagram: university department, university faculty, and enterprise DGs, each with attached clusters.]
Local DGs can be extended with local clusters

Hierarchical Desktop Grid
[Diagram: university department and faculty DGs and an enterprise department DG feeding university-level and enterprise-level DGs.]
Local DGs at the lower level of the hierarchy can be used to solve the applications of the higher-level DGs. E.g. university department and faculty DGs contribute to the university-level DG

Hierarchy: DG projects need modifications only for application registration

Challenges, based on the prototype we implemented:
- redundant workunits
- application representation
- application registration
- trust
- workunit deadlines

Redundant workunits
- redundancy can be enabled only at the highest level
- it is automatically disabled on every other level
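The rule above can be stated as a one-line policy. This sketch is illustrative; the function name and the redundancy factor of 3 are assumptions, not values from the SZDG prototype.

```python
# Illustrative redundancy policy for a hierarchical DG: redundant copies of
# a workunit are created only at the top level, so lower levels never
# re-duplicate work that is already replicated above them.

def replicas_for(level, top_level_redundancy=3):
    """Copies of a workunit to create at a given hierarchy level (0 = top)."""
    return top_level_redundancy if level == 0 else 1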

Workunit deadlines
- deadlines ensure that no workunit can be hijacked
- the deadline is set when the workunit is downloaded
- problem in a hierarchy: workunits transferred to lower levels already have the deadline timer ticking, so workunit download management is required
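The hierarchy problem above can be illustrated with a small budget calculation. This is a hypothetical sketch, not SZDG code: the function names and the 60-second transfer margin are assumptions.

```python
# Hypothetical illustration of the deadline problem: the deadline timer
# starts when the top-level server hands out a workunit, so a lower-level
# DG only gets the remaining time, minus a margin for the extra transfers.
import time

def remaining_budget(download_time, deadline_s, transfer_margin_s=60):
    """Seconds left for a workunit forwarded to a lower-level DG."""
    elapsed = time.time() - download_time
    return deadline_s - elapsed - transfer_margin_s

def can_forward(download_time, deadline_s, est_runtime_s):
    """Forward a workunit downward only if it can still finish in time."""
    return remaining_budget(download_time, deadline_s) >= est_runtime_s
```

A download manager at each level would apply such a check before pushing workunits further down the hierarchy.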

Trust
- a public/private keypair is used for code signing and workunit signing
- that's not enough for the hierarchy:
  - From where should I accept applications and work?
  - Who/what guarantees that the application does no harm?
  - How can I trust the work provider?

Trust
- to solve the problem we introduce X.509 certificates, not just for code signing and workunit signing, but also for distinguishing the Application Developer, the Project, the Server, and the Client

Application representation
- the application has a name and a version
- the Application Developer signs the application
- the application is identified by the name, version, and signature

Application registration

If we trust the Application Developer, we also trust the applications signed by her
- the project has a CA list with the trusted Application Developers and Projects
- if the Application Developer is trusted, the application is deployed
- the Project may sign the application
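The trust decision above can be sketched as follows. This is a simplified stand-in, not the SZDG implementation: the real system uses X.509 certificates, while the SHA-256 "signature" here only mimics one for illustration, and all names are hypothetical.

```python
# Simplified stand-in for the X.509-based trust decision: an application is
# identified by (name, version, signature) and is deployed only if its
# developer appears on the project's CA list AND the signature checks out.
import hashlib

def sign(name, version, code, developer_key):
    """The Application Developer 'signs' the application (toy signature)."""
    blob = "|".join((name, version, code, developer_key)).encode()
    return hashlib.sha256(blob).hexdigest()

def may_deploy(app, ca_list, developer_keys):
    """Deploy only applications from developers on the project's CA list."""
    name, version, code, developer, signature = app
    if developer not in ca_list:
        return False                      # developer is not trusted
    expected = sign(name, version, code, developer_keys[developer])
    return signature == expected          # the signature must match as well
```

Note that both conditions matter: a trusted developer's name on an application proves nothing unless the signature over the exact name, version, and code verifies.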

The application registration process
[Diagram: sign, sign, install, attach, contact, download cert and CA list (twice), workunit query, get app, workunit query, deploy.]

Conclusions of hierarchical SZDG
- Hierarchy is possible with minimal modification of existing projects
- For increased security or industrial usage, certificates provide a good solution
- By introducing certificates, most of the limitations can be solved
- The hierarchical version of SZDG works at prototype level and will be released during this year

Application development support in SZDG
- DC-API: available in the current SZDG as a package (Hungarian DG project). The goal is to provide an easy-to-use library for generating master/worker type Grid applications for both DG and SG systems. See the next lecture by Gabor Gombas.
- Portal access to SZDG (in the planning phase within the EU CancerGrid project): the goal is to provide a high-level, graphical, workflow-level portal to generate and run SZDG applications.

Portal support for SZDG
- Disadvantages of BOINC: creating BOINC applications is a difficult task; a BOINC application is static, not submittable
- Our goals: enable the dynamic submission of workflow and parameter sweep applications into the SZDG system; make it easy to create and submit workflow and parameter sweep applications for SZDG

An example CancerGrid workflow
[Diagram: a generator job fans x1 ... xN inputs out into NxM downstream executions.]
N = 20e-30e, M = 100 => a very large number of executions and files

P-GRADE Portal Submitter Service
- A web service component of the portal
- Receives job submission requests from the portal workflow manager
- Must be prepared to perform the following tasks: submit the job; check the submitted job's status; get the job's output; abort the job
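The four tasks listed above amount to an interface contract. The following is a minimal, hypothetical outline of that contract in Python; the real submitter is a web service, and these method names are ours, not the actual service interface.

```python
# Hypothetical outline of the submitter contract: any backend (BOINC,
# EGEE, ...) plugged into the portal must implement these four operations.
import abc

class SubmitterService(abc.ABC):
    """What any P-GRADE submitter backend must be prepared to do."""

    @abc.abstractmethod
    def submit(self, job):
        """Hand the job over to the target grid; return a job id."""

    @abc.abstractmethod
    def check_status(self, job_id):
        """Poll the submitted job's status."""

    @abc.abstractmethod
    def get_output(self, job_id):
        """Fetch the finished job's output."""

    @abc.abstractmethod
    def abort(self, job_id):
        """Cancel the job."""
```

Keeping the contract this small is what lets one workflow manager drive very different grids through interchangeable submitters.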

Submitter - Overview
[Diagram: the P-GRADE Portal's WF Manager handles jobs A, B, C, D; it sends job B to the Submitter Service with Submit, Check status, Abort, and Get output operations; the Submitter Service submits B to the grid and reports status changes back.]

Portal - SZDG integration tasks
- Create a special submitter for BOINC
- Manipulate the BOINC database: workunit creation, result status queries, ...
- Glue together short-running algorithm instances
- Consider job and job-owner priorities if needed
- Goal: minimize the overhead of BOINC and minimize network traffic, but do not increase the response time to users

Portal + SZDG (BOINC)
[Diagram: the Portal and its storage feed the Submitter, whose Queue Manager + Scheduler maintains per-algorithm queues (Q1, Q2, ... Qn, e.g. 2D, 3D, Mopac) and creates workunits in the BOINC server's MySQL database, from which donors fetch them; application data resides in the CancerGrid MS SQL database.]
The submitter stores the jobs sent by the portal in algorithm queues. Once a queue contains enough jobs, a BOINC workunit is created

Solution - short-running jobs
- Instances of algorithms running for a short time should be glued together in order to increase computation time per workunit and minimize BOINC overhead
- Scheduling questions: How many jobs are enough? How long should we wait for new jobs? What about algorithm/user priorities? How to detect bad donors?
- A Queue Manager takes care of the algorithm queues
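The gluing policy above can be sketched in a few lines. This is a hypothetical illustration, not the CancerGrid implementation: the class name, thresholds, and the decision to check only when a new job arrives are all assumptions.

```python
# Hypothetical sketch of the Queue Manager: short-running jobs accumulate
# in per-algorithm queues and are glued into one BOINC workunit once
# enough jobs have arrived, or the oldest queued job has waited too long.
from collections import defaultdict

class QueueManager:
    def __init__(self, batch_size=10, max_wait_s=300):
        self.batch_size = batch_size     # "how many jobs are enough?"
        self.max_wait_s = max_wait_s     # "how long to wait for new jobs?"
        self.queues = defaultdict(list)  # one queue per algorithm
        self.first_seen = {}             # arrival time of oldest queued job
        self.workunits = []              # glued batches, ready for BOINC

    def submit(self, algorithm, job, now):
        queue = self.queues[algorithm]
        if not queue:
            self.first_seen[algorithm] = now
        queue.append(job)
        self._maybe_flush(algorithm, now)

    def _maybe_flush(self, algorithm, now):
        queue = self.queues[algorithm]
        waited = now - self.first_seen[algorithm]
        if len(queue) >= self.batch_size or waited >= self.max_wait_s:
            self.workunits.append((algorithm, queue[:]))  # one glued workunit
            queue.clear()
```

A production version would also flush on a timer rather than only when a new job arrives, and would consult the algorithm/user priorities mentioned above when ordering the queues.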

Enabling Desktop Grids for e-Science (EDGeS)
A new FP7 project to be started on 01/01/2008. Goals of the project:
- To integrate Service Grids (SG) and Desktop Grids (DG) to attract new scientific communities that need a very large number of computing resources
- To involve new types of user and resource-provider communities beyond the scientific communities (school students, citizens of cities, companies)
- To provide APIs and Grid application development tools for the new scientific user communities in order to adapt their applications to the integrated SG-DG e-infrastructure
- To adapt the identified applications to the integrated SG-DG e-infrastructure
- To provide new trust mechanisms for the integrated SG-DG e-infrastructure
- To establish international collaborations, procedures, and standards
- To contribute to the establishment of a sustainable Grid infrastructure in Europe
The infrastructure to be established: an integrated Service Grid - Desktop Grid system

The EDGeS infrastructure
- Service Grid: EGEE
- BOINC-based DGs: public DG SZDG ( PCs); local DG UoW Grid ( PCs); public DG IN2P3 Grid (300 PCs); public DG EGEE-BOINC (planned, PCs); public DG Extremadura Grid ( PCs); public DG AlmereGrid ( PCs)
- XtremWeb-based DGs: public DG EGEE-XtremWeb ( PCs); local DG IN2P3 Grid (200 PCs)

Variety of DG systems within EDGeS
[Diagram: the EGEE production Grid connected to university, enterprise, local, and public DGs.]

Interoperability of EGEE with DG systems
[Diagram: the EGEE Service Grid bridged to several local and public DGs.]

Towards unlimited Grid resources
[Diagram: several Service Grids (EGEE, SEE-Grid, EELA) interconnected with many local and public DGs.]

Assessment of SZDG
Advantages:
- Easy to create and maintain: any organization can quickly and cheaply install it
- Easy to program, hence no steep learning curve
- Robust technology
- Industry can use it as an enterprise Grid
- It extends BOINC with: clusters as donated client systems; a hierarchy of DG systems; DC-API and portal access for easy generation of DG applications

Conclusions
- Desktop Grids are here for any community (universities, companies, etc.); such communities can access and/or build Grid systems
- SZTAKI Desktop Grid technology enables easy creation and programming of global and local desktop Grids
- SZTAKI is ready to help any organization: to set up its local DG(s); to port applications to local DGs; to train people how to build and use local DGs
More information on: