WS-PGRADE portal and its usage in the CancerGrid project M. Kozlovszky, P. Kacsuk Computer and Automation Research Institute of the Hungarian Academy of Sciences PUCOWO, Zurich, Switzerland 10-11/06/2010

2 Motivations of creating gUSE
To overcome (most of) the limitations of the P-GRADE portal:
- To provide better modularity → any service can be replaced
- To improve scalability → up to millions of jobs
- To enable advanced dataflow patterns
- To interface with a wider range of resources
- To separate the Application Developer view from the Application User view
WS-PGRADE (Web Services Parallel Grid Runtime and Developer Environment) and gUSE (Grid User Support Environment) architecture

3 WS-PGRADE/gUSE
- Creating complex workflows and parameter sweeps
- Seamless access to various types of resources: clusters, service grids, desktop grids, databases
- Scalable architecture
- Advanced dataflows
- Creating complex applications using embedded workflows and legacy codes
- Comfort features
- Separated views
- Community components from the workflow repository

4 WS-PGRADE architecture (diagram)
- Graphical User Interface: WS-PGRADE (Gridsphere portlets)
- Autonomous services (high-level middleware service layer): Workflow Engine, Workflow storage, File storage, Application repository, Logging, gUSE information system, gUSE Meta-broker, Submitters
- Resources (middleware service layer): local resources, service grid VOs, Desktop Grid resources, Web services, databases

5 Application lifecycle in WS-PGRADE
- Define the workflow structure
- Configure the workflow: define content for tasks
- Run a test: use local resources, Web services, databases
- Scale the workflow for large simulations: use batch systems, cluster grids, desktop grids
- Fix some parameters, leave some open
- Result: an application-specific science gateway for end users
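The "fix some parameters, leave some open" step can be pictured as expanding a sweep specification into concrete job configurations. A minimal sketch, assuming nothing about gUSE's internal job format (the function and parameter names here are illustrative only):

```python
from itertools import product

def expand_sweep(fixed, sweep):
    """Expand a parameter-sweep spec into concrete job configurations.

    `fixed` holds the parameters the developer locked down; `sweep` maps
    each open parameter to the list of values end users scale over.
    """
    names = list(sweep)
    jobs = []
    for values in product(*(sweep[n] for n in names)):
        job = dict(fixed)               # start from the fixed parameters
        job.update(zip(names, values))  # fill in one sweep combination
        jobs.append(job)
    return jobs

# Two open parameters with 3 and 2 values yield 3 x 2 = 6 job instances.
jobs = expand_sweep({"method": "AM1"},
                    {"molecule": ["m1", "m2", "m3"], "charge": [0, 1]})
```

This is how a small test workflow scales to large simulations: the structure stays fixed while the open parameters multiply the job count.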

6 WS-PGRADE application: acyclic dataflow
Task types:
- Job to run on a dedicated machine
- Job to run in a gLite VO
- Job to run in a Globus 2/4 VO
- Task to run in a BOINC grid
- Web service invocation
- Database operation (read/write)
Input types:
- File from the client host
- File from a GridFTP site
- File from an LFC catalog
- Input string from a task or service
- Result of a database query
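Such a workflow is a directed acyclic graph of typed tasks. A minimal sketch of that structure, with an acyclicity check; the `Task` class shape and field names are assumptions for illustration, not gUSE's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """One node of the dataflow graph (illustrative shape only)."""
    name: str
    target: str                  # e.g. "gLite VO", "Globus 2/4 VO", "BOINC grid"
    inputs: list = field(default_factory=list)   # names of upstream tasks

def is_acyclic(tasks):
    """Verify the task graph is a DAG, as a WS-PGRADE workflow must be."""
    index = {t.name: t for t in tasks}
    in_progress, finished = set(), set()

    def visit(name):
        if name in finished:
            return True
        if name in in_progress:      # back edge: the graph has a cycle
            return False
        in_progress.add(name)
        ok = all(visit(p) for p in index[name].inputs)
        in_progress.discard(name)
        finished.add(name)
        return ok

    return all(visit(t.name) for t in tasks)
```

A depth-first traversal like this is the standard way to reject cyclic wirings before submission.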

7 Dataflow programming with gUSE
- Separate application logic from data
- Cross and dot product data-pairing (concept from Taverna): all-to-all vs. one-to-one pairing of data items
- Generator components: produce many output files from one input file
- Collector components: produce one output file from many input files
- Any component can be a generator or a collector
- Conditional execution based on equality of data
- Nesting, cycles, recursion of tasks
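The two pairing modes and the generator/collector roles can be sketched in a few lines; these plain functions only illustrate the semantics, they are not gUSE APIs:

```python
from itertools import product

def dot_pairing(xs, ys):
    """One-to-one: the i-th item of one port pairs with the i-th of the other."""
    return list(zip(xs, ys))

def cross_pairing(xs, ys):
    """All-to-all: every item of one port pairs with every item of the other."""
    return list(product(xs, ys))

def generator(seed, n):
    """Generator role: many output items from one input item."""
    return [f"{seed}-{i}" for i in range(n)]

def collector(parts):
    """Collector role: one output item from many input items."""
    return "\n".join(parts)
```

Dot product keeps parallel streams aligned (pairing each molecule with its own parameter file), while cross product fans out every combination (every molecule against every model).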

8 Ergonomics
Users can be grid application developers or end users.
Application developers:
- Design sophisticated dataflow graphs: embedding to any depth, recursive invocations, conditional structures, generators and collectors at any position
- Publish applications in the repository at certain stages of work: applications, projects, concrete workflows, templates, graphs
End users:
- See WS-PGRADE and gUSE as a science gateway
- Browse the list of ready-to-use applications in the repository
- Import and execute applications without knowledge of programming, dataflow, or grids

9 Current users of gUSE
- EDGeS project (Enabling Desktop Grids for e-Science): integrating EGEE with BOINC and XtremWeb technologies; user interfaces and tools
- ProSim project: in silico simulation of intermolecular recognition (see next presentation)
- University of Westminster Desktop Grid: using AutoDock on institutional PCs
- CancerGrid project: predicting various properties of molecules to find anti-cancer leads; creating a science gateway for chemists

10 Motivation to use gUSE and WS-PGRADE for CancerGrid
- Arbitrary number of generators (and collectors) within one workflow, at arbitrary locations
- Scalability: the number of jobs within one workflow is in the range 100K…1M
- Reuse of the existing end-user configuration GUI (easy-to-use, web-based, user-specific): no application-specific portlet for end users was needed

11 CancerGrid infrastructure (diagram)
- Molecule database server: molecule database; browsing molecules
- Portal and DesktopGrid server: portal (executing workflows, portal storage, local jobs on a local resource); BOINC server with 3G Bridge mapping DG jobs (Job 1…Job N) to work units (WU 1…WU N)
- DG clients from all partners: BOINC client running the legacy application, with GenWrapper for batch execution

12 CancerGrid workflows
- Descriptor calculation
- Property prediction
- Model building
- Screening

13 Working on the CancerGrid Portal – step-by-step Initial state: molecules/structures stored in DB, organised into lists User selects list of molecules/structures User selects/downloads a workflow from repository User configures the workflow to take the list as input User optionally updates parameters of the modules Submits workflow Optionally monitors the status When workflow finished, results are stored in the DB
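The steps above can be sketched as one driver function. Every name here (`portal`, `select_molecule_list`, and the rest) is hypothetical: the real CancerGrid portal drives these steps through its web GUI, not through this API.

```python
import time

def run_screening(portal, list_name, workflow_name, params, poll_interval=30):
    """Hypothetical sketch of the end-user steps as pseudo-API calls."""
    molecules = portal.select_molecule_list(list_name)    # pick a molecule list
    wf = portal.import_workflow(workflow_name)            # fetch from repository
    wf.set_input(molecules)                               # the list becomes the input
    wf.update_parameters(params)                          # optional module tuning
    job_id = portal.submit(wf)                            # run the workflow
    while portal.status(job_id) not in ("finished", "error"):  # optional monitoring
        time.sleep(poll_interval)
    return portal.store_results(job_id)                   # results back to the DB
```

The point of the sketch is the shape of the interaction: configuration is a handful of choices over pre-built workflows, so a chemist never touches grid middleware directly.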

14 Molecule lists

15 Molecule viewer

16 Downloading workflow from repository – End user view

17 Workflow configuration

18 List of workflows (Novice user view)

19 Status monitor – End user view

20 Job statuses of a workflow - Developer view

21 Conclusions
- WS-PGRADE: implemented on top of the scalable, web-service-based gUSE architecture
- More expressive dataflow patterns
- Transparent access to local resources, service grids, desktop grids, databases, and Web services
- Application repository: a service for collaboration between developers and end users

22 Next steps
- User manual
- Request a user account

23 Thank you for your attention! Questions?
Acknowledgement: CancerGrid EU FP6 project (FP6 LIFESCIHEALTH-7)

24 Applications in CancerGrid
- Flexmol: an XML-based molecular language
- Molecule 2D/3D converter (Cmol3D)
- Molecule 3D conformation generator (Cmol3D)
- MOPAC (Molecular Orbital PACkage): a semiempirical quantum chemistry program based on Dewar and Thiel's NDDO approximation
- Codessa Pro (Comprehensive Descriptors for Structural and Statistical Analysis): a software suite for developing quantitative structure-activity/property relationships
- Matrix former
- QSAR model builder: quantitative structure-activity relationship (QSAR) is the process by which chemical structure is quantitatively correlated with a well-defined process, such as biological activity or chemical reactivity
- (Chemical) property predictor
- File format converters (to integrate the above tools into a workflow)

25 Cmol3D property settings

26 The CancerGrid infrastructure
PRODUCTION system:
- gUSE portal
- BOINC server (private desktop grid with firewall and controlled donor access)
- Monitoring info
- 69 machines (AMRI 10, SZTAKI 56, UPF 2, UoJ 1)
TEST system:
- gUSE portal
- BOINC server (private desktop grid with firewall and controlled donor access)
- Monitoring info
Performance measurements: … mols, 6 days, ~70 machines, 10 confs