2010-04-27

Combining Desktop and Service Grids to Support e-Scientists to Run Simulations (European Desktop Grid Infrastructure, EDGI)

G. Terstyanszky, T. Kukla, T. Kiss, S. Winter (Centre for Parallel Computing, School of Electronics and Computer Science, University of Westminster, London, United Kingdom)
J. Kovacs, Z. Farkas, P. Kacsuk (MTA SZTAKI, Budapest, Hungary)

2 Docking and Molecular Dynamics Simulations

[Figure: a sugar (ligand) docking into the binding pocket of a protein (receptor)]

3 In-vitro (wet lab) research investigates components of an organism that have been isolated from their usual biological surroundings, permitting more detailed and convenient analysis than is possible with whole organisms.

In-silico simulation simulates components of an organism, for example the docking of ligands and proteins: downloading them from public libraries, binding them, and analysing the properties of the compound molecules.

Aims of in-silico docking simulation:
- Understanding how pathogens bind to cell-surface proteins can lead to the design of carbohydrate-based drugs and of diagnostic and therapeutic agents.
- Highlighting potential novel inhibitors and drugs for in-vitro and on-chip testing.

4 Docking and Molecular Dynamics Simulations

Advantages of in-silico methods:
- Reduced time and cost: in-vitro experiments are expensive.
- Better focusing of wet-laboratory resources: experiments can be planned by selecting the best molecules to investigate.
- Increased number of molecules screened.

Problems of in-silico experiments:
- Time consuming: weeks or months on a single computer.
- Simulation tools are too complex for an average bio-scientist (Linux command-line interfaces).
- Bio-molecular simulation tools are not widely tested and validated: are the results really useful and accurate?

5 In-silico Simulation in Service Grids

[Workflow diagram: two inputs, PDB file 1 (receptor) and PDB file 2 (ligand); steps: check (MolProbity), energy minimization (GROMACS), validate (MolProbity), perform docking (AutoDock), molecular dynamics (GROMACS); grouped into Phases 1-4]

6 In-silico Simulation in Service Grids

- Phase 1: pre-processing of the protein
- Phase 2: pre-processing of the sugar
- Phase 3: docking
- Phase 4: molecular dynamics simulation

Executed on 5 different sites of the UK NGS; parameter sweeps in Phases 3 and 4; MPI in Phase 4.

EDGI Infrastructure

Usage Scenario in Desktop and Service Grids

[Architecture diagram: the e-scientist searches, selects and downloads an application's implementation from the EDGI Application Repository and submits it through the EDGI Portal; the SG Broker dispatches it to the Service Grid Compute Elements (1..n); the SG->DG Bridge passes work to the Desktop Grid Server, which queries the implementation in the repository and distributes it to the Worker Nodes (1..m); the DG admin retrieves and deploys the implementation]

EDGI Application Repository: Actors, Entities and Operations

Repository actors and operations:
[Table: actors (E-scientists, Application Developers, Application Validators, Desktop Grid Administrators, Repository Administrators; with or without registration) versus operations (user/group management, platform management, upload application, mark application valid, browse/search applications, download applications); e-scientists have the fewest rights, repository administrators have all of them]

Repository entities:
- Application: represents an application whose implementations can be executed on the EDGI infrastructure; it describes the inputs and outputs and explains what the application does.
- Implementation: an application implementation; it contains references (e.g. URLs) to all the files and data necessary to run the application on a given platform, plus metadata.
- Platform: describes the desktop grid and/or service grid environment where the implementation can be executed.
- Configuration: contains the implementation files required to run the application.
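The four repository entities above can be pictured as linked records. A minimal sketch, assuming plain key-value records; the field names are illustrative, not the actual EDGI repository schema:

```python
# Illustrative sketch of the four EDGI repository entities as plain records.
# Field names and values are hypothetical; the real schema is not on the slide.

application = {
    "name": "AutoDock",
    "description": "Docks a ligand into a receptor binding site",
    "inputs": ["receptor.pdb", "ligand.pdb"],
    "outputs": ["dlg files"],
}

implementation = {
    "application": "AutoDock",                 # which application it implements
    "platform": "BOINC desktop grid",          # where it can run
    "files": ["http://repo.example.org/autodock/bin.tgz"],  # references, not copies
    "metadata": {"version": "4.2"},
}

platform = {
    "name": "BOINC desktop grid",
    "kind": "desktop-grid",                    # or "service-grid"
}

configuration = {
    "implementation": "AutoDock 4.2",
    "files": ["autodock4", "run.sh"],          # files required to run the application
}
```

Note how an implementation points to an application and to a platform by name, which is what lets the bridge and the DG server look up the right binaries for their environment.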

10 EDGI Application Repository: User Interface

- Main menu: select users & groups, applications (implementations), platforms, and validation pages
- Action menu: create/delete entities, upload/download applications & implementations, add/edit/remove metadata
- Search: users & groups, applications & implementations, platforms

11 EDGI Application Repository: Application Metadata

12 EDGI Application Repository: Implementation Metadata

EDGI Application Repository in the EDGI Infrastructure

[Architecture diagram, as on the Usage Scenario slide: e-scientist, EDGI Portal, EDGI Application Repository, SG Broker, Compute Elements (1..n), SG->DG Bridge, Desktop Grid Server, Worker Nodes (1..m)]

University of Westminster Local Desktop Grid

DG clients:
- New Cavendish Street: 576 nodes
- Marylebone Campus: 559 nodes
- Regent Street: 395 nodes
- Wells Street: 31 nodes
- Little Titchfield Street: 66 nodes
- Harrow Campus: 254 nodes

Lifecycle of a DG node:
1. PCs are normally used by students/staff.
2. If unused, the PC switches to Desktop Grid mode.
3. When no more work arrives from the DG server, the PC shuts down (green solution).
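The three-step lifecycle above is effectively a small state machine. A minimal sketch; the state names, the event vocabulary, and the "user logs back in" transition are assumptions, not the actual Westminster DG client implementation:

```python
# Hypothetical state machine for the slide's DG node lifecycle.
# (state, event) -> next state; anything unlisted leaves the state unchanged.
TRANSITIONS = {
    ("student_use", "idle"): "desktop_grid",      # unused PC switches to DG mode
    ("desktop_grid", "user_login"): "student_use", # assumed: user reclaims the PC
    ("desktop_grid", "no_work"): "shutdown",       # green solution
}

def step(state: str, event: str) -> str:
    """Return the next lifecycle state; unknown events are ignored."""
    return TRANSITIONS.get((state, event), state)
```

For example, `step("student_use", "idle")` yields `"desktop_grid"`, and a subsequent `"no_work"` event takes the node to `"shutdown"`.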

15 In-Silico Docking User Scenario

[Workflow diagram: the bio-scientist supplies a gpf file, a ligand pdb file and a receptor pdb file; prepare_ligand4.py and prepare_receptor4.py produce pdbqt files; AUTOGRID produces the map files; AUTODOCK, driven by a dpf file, produces dlg files; SCRIPT1 and SCRIPT2 select the best dlg files and produce the final pdb file]

Research objectives:
- Constructing a library of tens of thousands of small-molecule candidates available in databases (e.g. DrugBank) and preparing PDBQT files
- Screening the library against known targets using AutoDock Vina
- Making the small-molecule library available to other researchers
- Validating promising candidates in vitro
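The diagram's tool chain can be written out as a command sequence. A sketch that only builds the commands; the tool names (prepare_receptor4.py, prepare_ligand4.py, autogrid4, autodock4) are the standard AutoDock 4 / AutoDockTools binaries, while the output file names here are illustrative:

```python
# Build the shell commands for one docking run, following the slide's diagram:
# PDB inputs -> pdbqt files -> grid maps (autogrid4) -> dlg result (autodock4).

def docking_commands(receptor_pdb: str, ligand_pdb: str,
                     gpf: str, dpf: str) -> list[str]:
    """Return the command sequence from PDB inputs to a dlg result."""
    return [
        f"prepare_receptor4.py -r {receptor_pdb} -o receptor.pdbqt",
        f"prepare_ligand4.py -l {ligand_pdb} -o ligand.pdbqt",
        f"autogrid4 -p {gpf} -l autogrid.glg",   # writes the map files
        f"autodock4 -p {dpf} -l docking.dlg",    # writes the dlg result
    ]

cmds = docking_commands("receptor.pdb", "ligand.pdb", "grid.gpf", "dock.dpf")
```

In the workflow these commands are not typed by the bio-scientist; they are wrapped in the jobs described on the next slide.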

16 In-Silico Docking Workflow

Inputs: receptor.pdb, ligand.pdb, a gpf descriptor file, a dpf descriptor file, the number of work units, and the AutoGrid executables and scripts (uploaded by the developer; do not change them). Output: a pdb file.

- The Generator job creates the specified number of AutoDock jobs.
- The AutoGrid job creates pdbqt files from the pdb files, runs the autogrid application, generates the map files and zips them into an archive. This archive is the input of all AutoDock jobs.
- The AutoDock jobs run on the Desktop Grid and produce dlg files as output.
- The Collector job collects the dlg files, takes the best results and concatenates them into a pdb file.
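The Collector's "take the best results" step can be sketched by ranking dlg files on their reported binding energy. The energy line format below is AutoDock 4's dlg output; the selection policy (keep the lowest-energy files) is an assumption about what the slide's scripts do:

```python
import re

# AutoDock 4 dlg files report lines like:
#   DOCKED: USER    Estimated Free Energy of Binding    =   -7.23 kcal/mol
ENERGY_RE = re.compile(r"Estimated Free Energy of Binding\s*=\s*([-+]?\d+\.\d+)")

def best_energy(dlg_text: str) -> float:
    """Lowest (most favourable) binding energy reported in one dlg file."""
    energies = [float(m.group(1)) for m in ENERGY_RE.finditer(dlg_text)]
    if not energies:
        raise ValueError("no binding energy found in dlg text")
    return min(energies)

def collect(dlg_texts: dict[str, str], keep: int = 10) -> list[str]:
    """Names of the `keep` dlg files with the best (lowest) energies."""
    ranked = sorted(dlg_texts, key=lambda name: best_energy(dlg_texts[name]))
    return ranked[:keep]
```

A real Collector would then extract the docked coordinates from the winning dlg files and concatenate them into the final pdb file.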

17 EDGI Docking Portal

- Free access to pre-deployed molecular docking "primitive" scenarios running on the EDGI infrastructure: random blind docking and virtual screening
- DG versions of the applications come from the EDGI Application Repository
- Docking workflows are executed on the Desktop Grid
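Virtual screening on a desktop grid is a parameter sweep: the ligand library is split into work units, one per AutoDock job. A minimal sketch assuming a simple fixed batch size; the portal's actual splitting policy is not shown on the slide:

```python
# Split a ligand library into desktop-grid work units of at most `per_unit`
# ligands each; the last work unit may be smaller.

def make_work_units(ligands: list[str], per_unit: int) -> list[list[str]]:
    """Batch the ligand list into consecutive, equally sized chunks."""
    return [ligands[i:i + per_unit] for i in range(0, len(ligands), per_unit)]

units = make_work_units([f"ligand_{n}.pdbqt" for n in range(10)], per_unit=4)
# units -> 3 work units of sizes 4, 4 and 2
```

Each work unit would then be packaged with the shared grid-map archive produced by the AutoGrid job and dispatched to a DG worker node.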

18 Docking the Protozoan Neuraminidase

19 Docking the Protozoan Neuraminidase

20 Conclusions

Computer scientists:
- Created the combined desktop grid and service grid infrastructure on which e-scientists can run their applications.
- The EDGI Application Repository and Portal support application developers, e-scientists and application validators.

Bio-scientists:
- The EDGI infrastructure can provide potentially unlimited computational power to biologists.
- It offers access to methodology (application porting) and tools (portal and repository).
- They have a library of small molecules available for screening and access to chip-based technology.