UltraScan Gateway Advanced Support
GIG Team: Suresh Marru, Raminder Singh, Marlon Pierce (Pervasive Technology Institute, Indiana University)
Gateway Personnel: Borries Demeler, Emre Brookes (UT Health Science Center)
April 16th, 2010

Outline
UltraScan Gateway
– Target Community
– Usage Statistics
– Current Architecture
Advanced Support Request
– Requested Support
– Current Status
Gateway Enhancement Plan
– Overview of OGCE
– Customizing/Extending/Integrating OGCE tools with UltraScan

Slide Courtesy: This talk is derived from the Advanced Support Request submitted by Dr. Borries Demeler.

UltraScan Science Gateway
– A biophysics gateway for investigating the properties and structure-function relationships of biological macromolecules, nanoparticles, polymers, and colloids implicated in many diseases, including cancer.
– Provides high-resolution analysis and modeling of hydrodynamic data from an analytical ultracentrifuge (AUC).

AUC Experimental Setup – Analytical Ultracentrifuge (figure)

High-Level Overview (architecture diagram): user; web server with US-LIMS MySQL DB; GridControl; TIGRE/Globus network; UTHSCSA Jacinto terascale storage; TeraGrid high-performance computing clusters.

Application & Gateway Software Application Software: UltraScan – provides the highest resolution analysis possible for AUC experiments. –solves the inverse problem of extracting molecular parameters from experimental data on parallel computing infrastructure. Gateway Software:US-LIMS – Ultrascan Laboratory Information System –Provides access to HPC resources to broader research community –Experiment Management Interfaces April 16 th 2010

UltraScan Usage by Month (chart)

TeraGrid Usage in 2008 and 2009 (chart)

Example Data Analysis – CuZn hSOD mutant, freshly isolated (data courtesy of John Hart, UTHSCSA)

Applying Monte Carlo Simulations – CuZn hSOD mutant after 7 days at 4°C (data courtesy of John Hart, UTHSCSA)

ASTA Request
1. Porting to new architectures and parallel performance enhancements
2. Porting to and incorporation of TeraGrid storage and server environments
3. New workflow implementations, new grid computing and grid middleware support
4. Additional parallelizations to improve scaling for Monte Carlo analyses (see the sketch below)
5. Hardening of code performance through checkpointing and processor failure detection
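To illustrate item 4, here is a minimal sketch, assuming mpi4py and a placeholder fit_once routine, of dividing independent Monte Carlo iterations across MPI ranks; UltraScan's actual MPI analysis code is not reproduced here.

```python
# Minimal sketch of parallel Monte Carlo scaling (hypothetical, not UltraScan code).
# Each MPI rank runs an independent share of the Monte Carlo iterations and the
# results are gathered on rank 0 for aggregation.
from mpi4py import MPI
import random

def fit_once(seed):
    """Placeholder for one Monte Carlo fit of molecular parameters."""
    rng = random.Random(seed)
    return rng.gauss(1.0, 0.1)   # stands in for a fitted parameter value

def main(total_iterations=100):
    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Round-robin assignment of iterations to ranks keeps the load balanced.
    my_results = [fit_once(seed=i) for i in range(rank, total_iterations, size)]

    all_results = comm.gather(my_results, root=0)
    if rank == 0:
        flat = [r for chunk in all_results for r in chunk]
        print(f"{len(flat)} Monte Carlo fits, mean = {sum(flat)/len(flat):.4f}")

if __name__ == "__main__":
    main()
```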

OGCE Gateway Tool Adaptation & Reuse (OVP/RST/MIG diagram): components contributed by LEAD, GridChem, the TeraGrid User Portal, and the OGCE team (GFac, XBaya, XRegistry, FTR Eventing System, Resource Discovery Service, GPIR, File Browser, Gadget Container, GTLab, JavaScript COG, XRegistry Interface, Experiment Builder, Axis2 GFac, Axis2 Eventing System, Resource Prediction Service, Workflow Suite) are re-engineered, generalized, built, tested, and released by OGCE, then reused by gateways such as GridChem, UltraScan, BioVLab, ODI, Bio Drug Screen, EST Pipeline, and FutureGrid.

OGCE Workflow Suite
Generic Service Toolkit
– Tool to wrap command-line applications as web services (conceptual sketch below)
– Handles file staging, job submission, and monitoring
– Extensible runtime for security, resource brokering, and urgent computing
– Generic factory service for on-demand creation of application services
XRegistry
– Information repository for the OGCE workflow suite
– Register, search, retrieve, and share XML documents
– User- and hierarchical-group-based authorization
XBaya
– GUI-based tool to compose and monitor workflows
– Extensible support for compiler plug-ins such as BPEL, Jython, SCUFL
– Dynamic workflow execution support to start, pause, resume, and rewind workflow executions
Apache ODE Scientific Workflow Extensions
– XBaya GUI integration for BPEL generation
– Asynchronous support for long-running workflows
– Instrumented with fine-grained monitoring
Eventing System
– Supports both WS-Eventing and WS-Notification standards
– Highly scalable
– Persistent message box for clients behind firewalls or with intermittent network connectivity
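To make the "wrap command-line applications as web services" idea concrete, here is a minimal conceptual sketch in Python, assuming a hypothetical HTTP endpoint and a placeholder command. It is not the GFac API, which is Axis2-based and also handles file staging, batch submission, and monitoring.

```python
# Conceptual sketch only: illustrates the idea of exposing a command-line
# application as a service, as the Generic Service Toolkit does. This is NOT
# the GFac API; the endpoint and application command are hypothetical.
import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

APP_COMMAND = ["echo"]  # placeholder for a real scientific executable

class ApplicationService(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the invocation request (here: a JSON list of arguments).
        length = int(self.headers.get("Content-Length", 0))
        args = json.loads(self.rfile.read(length) or "[]")

        # Run the wrapped command-line application; a real service would also
        # stage input/output files and submit to a batch scheduler.
        result = subprocess.run(APP_COMMAND + args, capture_output=True, text=True)

        body = json.dumps({"returncode": result.returncode,
                           "stdout": result.stdout}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), ApplicationService).serve_forever()
```

A client would POST a JSON argument list to the endpoint and receive the return code and standard output; the real toolkit layers security, registration, and monitoring on top of this basic pattern.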

GridChem Science Gateway
– A chemistry/materials science gateway for running computational chemistry codes, workflows, and parameter sweeps.
– Integrates molecular science applications and tools for community use, with users making heavy use of TeraGrid.
– Consistently one of the top five TeraGrid gateway users.
– Supports popular chemistry applications including Gaussian, GAMESS, NWChem, QMCPack, Amber, MolPro, and CHARMM.

GridChem Advanced Support
– GridChem currently supports single-application executions.
– Advanced support requested for workflow support.
– Improved fault tolerance.

GridChem OGCE Integration
– OGCE workflow tools wrapped the Gaussian and CHARMM chemistry applications.
– Coupled butane workflow using the Gaussian and CHARMM integration.
– 100-member Gaussian parametric sweeps (see the sketch below).
– Integration with Pegasus workflow tools.
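As an illustration of the 100-member parametric sweep above, here is a minimal sketch assuming a hypothetical input template and submit() helper; GridChem's production sweep machinery built on the OGCE tools is not shown.

```python
# Minimal sketch of a 100-member parametric sweep (hypothetical; the template,
# parameter grid, and submit() helper are illustrative only).
from pathlib import Path

TEMPLATE = "%mem=2GB\n# opt b3lyp/6-31g\n\nbutane scan point {index}\n\n0 1\n{geometry}\n"

def generate_inputs(geometries, workdir="sweep"):
    """Write one input file per sweep member and return their paths."""
    out = Path(workdir)
    out.mkdir(exist_ok=True)
    paths = []
    for i, geom in enumerate(geometries):
        path = out / f"member_{i:03d}.com"
        path.write_text(TEMPLATE.format(index=i, geometry=geom))
        paths.append(path)
    return paths

def submit(path):
    """Placeholder: a real gateway would stage the file and submit a batch job."""
    print(f"submitting {path}")

if __name__ == "__main__":
    # 100 sweep members; real geometries would come from the experiment setup.
    dummy_geometries = [f"C 0.0 0.0 {0.01 * i:.2f}" for i in range(100)]
    for input_file in generate_inputs(dummy_geometries):
        submit(input_file)
```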

GridChem Using OGCE Tools (figure: initial structure vs. optimized structure) – GridChem using OGCE workflow tools to construct and execute CHARMM and Gaussian molecular chemistry models.

UltraScan Gateway Architecture (diagram): user; Apache web interface with UltraScan LIMS DB; gateway middleware analysis control unit; job schedulers on clusters 1 through n, each fronted by a distributor module fanning work out to slots S1, S2, ..., Sn.

Gateway Advanced Support
Coordinate with HPC ASTA support on porting to new architectures and parallel performance enhancements.
New workflow implementations and new grid computing and grid middleware support (a fault-tolerance sketch follows this list):
– Reliability problems with WS-GRAM
– Missing job status
– Only supports GRAM4; needs porting to other middleware
– Issues with data movement
– Need fault tolerance at all levels
– Users select resources manually; need automated scheduling
Current Architecture (diagram)
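To make the reliability and fault-tolerance items concrete, here is a minimal retry-with-backoff sketch around a hypothetical submit_job() call standing in for a WS-GRAM submission; the fault tolerance requested for UltraScan spans more levels than this single wrapper.

```python
# Minimal retry-with-backoff sketch for unreliable job submission
# (hypothetical: submit_job() stands in for a WS-GRAM or other middleware call).
import random
import time

class SubmissionError(RuntimeError):
    """Raised when the middleware rejects or drops a submission."""

def submit_job(job_description):
    # Placeholder for the real middleware call; fails randomly to mimic
    # the reliability problems listed on the slide.
    if random.random() < 0.5:
        raise SubmissionError("transient middleware failure")
    return f"job-{random.randint(1000, 9999)}"

def submit_with_retries(job_description, max_attempts=5, base_delay=2.0):
    """Retry transient submission failures with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return submit_job(job_description)
        except SubmissionError as err:
            if attempt == max_attempts:
                raise
            delay = base_delay * 2 ** (attempt - 1)
            print(f"attempt {attempt} failed ({err}); retrying in {delay:.0f}s")
            time.sleep(delay)

if __name__ == "__main__":
    print("submitted:", submit_with_retries({"executable": "ultrascan_analysis"}))
```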

Derived ASTA Requirements
– Enhance the Perl job submission daemon with the OGCE GFac service.
– Enhance the socket-based job monitoring with the OGCE Eventing System (see the listener sketch below).
– Implement and iteratively enhance fault tolerance.
– Port to community account usage with GridShib auditing support.
– Support UNICORE to run jobs on European and Australian resources.
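As a conceptual illustration of moving from socket polling to event-driven job monitoring, here is a minimal publish/subscribe sketch using an in-process queue; the OGCE Eventing System itself speaks WS-Eventing/WS-Notification and provides a persistent message box, neither of which is modeled here.

```python
# Conceptual sketch of event-driven job-status monitoring (not the OGCE
# Eventing System API; job IDs and the in-process queue are illustrative).
import queue
import threading

status_events = queue.Queue()

def publish(job_id, status):
    """A compute-side wrapper would publish status changes as they happen."""
    status_events.put({"job_id": job_id, "status": status})

def listener(stop):
    """Gateway-side subscriber: reacts to events instead of polling sockets."""
    while not stop.is_set() or not status_events.empty():
        try:
            event = status_events.get(timeout=0.5)
        except queue.Empty:
            continue
        print(f"job {event['job_id']} -> {event['status']}")  # e.g. update US-LIMS DB

if __name__ == "__main__":
    stop = threading.Event()
    t = threading.Thread(target=listener, args=(stop,))
    t.start()
    for status in ("QUEUED", "RUNNING", "FINISHED"):
        publish("ultrascan-job-42", status)
    stop.set()
    t.join()
```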

GFac Science Application Wrapper Tool

GFac Existing & Requested Features Input Handlers Scheduling Interface Auditing Monitoring Interface Data Management Abstraction Job Management Abstraction Job Management Abstraction Fault Tolerance Output Handlers Registry Interface Checkpoint Support Apache Axis2 Globus Campus Resources Unicore Gram5 Amazon Eucalyptus Color Coding UltraScan Requested Features Existing Features

Target Compute Resources
– UT Health Science Center clusters
– Texas state grid: HiPCAT (TIGRE software stack)
– TeraGrid resources: Ranger, Lonestar, Abe/Queenbee, Big Red
– Clusters in Germany and Australia

OGCE-Based UltraScan Development Architecture (diagram labels: GFac, Eventing System, Fault Tolerance; UltraScan middleware; Quarry gateway hosting machine; manual process).

Current Status
– Deployed the UltraScan software on the IU Gateway Hosting Service.
– Replicated the production system to facilitate gateway and HPC ASTA testing.
– Deployed the UltraScan application on Queenbee and Ranger.
– Working on GRAM5 integration on Ranger.

Future Work
– Add support for UNICORE job management.
– Build fault tolerance.
– Iterate on the ASTA requirements.