National Center for Supercomputing Applications The Computational Chemistry Grid: Production Cyberinfrastructure for Computational Chemistry PI: John Connolly.


The Computational Chemistry Grid: Production Cyberinfrastructure for Computational Chemistry
PI: John Connolly
Co-PIs: John Towns (NCSA/Univ of Illinois), Barbara Kucera (CCS/Univ of Kentucky), Steve Gordon (OSC), Kent Milfeld (TACC/Univ of Texas-Austin), Gabrielle Allen (CCT/LSU)
Supported by the NSF NMI Program under Award #

Overview
Computational Chemistry Grid (CCG)
– provides a collection of grid-based resources to routinely run chemical physics applications
– builds a distributed infrastructure for open scientific research
– focuses on an application space that does not require a high-speed network in its infrastructure
– integrates a desktop environment into an infrastructure for a specific community of users:
  computational chemists with both small- and large-scale needs
  experimental chemists who occasionally need simulation capabilities to verify experimental results

Why the CCG?
Provides production infrastructure to an amenable community of researchers
– lowers the barrier to use of significant computational resources for the entire community
Large-center resources are often difficult to use due to policies
– computational chemistry applications typically run on relatively few processors for extended periods
Leverages extant technologies
– GridChem, Condor, GridFTP, GSI, …
Integrates commonly used computational chemistry codes
– Gaussian 98/03, GAMESS, MolPro, …

Cyberinfrastructure Integration
– CCG Architecture
– GridChem Client
– Middleware
– Computational chemistry software
– Training and Outreach
– User Support
– Leveraged Technologies
– Metrics of Success

CCG Architecture
Three-tiered architecture
– client
– middleware server
– computational server
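The three tiers can be sketched as follows. This is a minimal illustration, not the actual GridChem/CCG code: all class and field names are hypothetical, and real authentication (GSI) and resource selection are reduced to stand-in checks. The key point it shows is that the client never contacts a compute host directly; every request passes through the middleware server.

```python
# Hypothetical sketch of the CCG three-tier flow (illustrative names only).

class ComputeServer:
    """Tier 3: runs the chemistry application on an HPC resource."""
    def run(self, job):
        return {"job": job["name"], "status": "COMPLETED"}

class MiddlewareServer:
    """Tier 2: authenticates the user, picks a resource, launches the job."""
    def __init__(self, resources):
        self.resources = resources

    def submit(self, credential, job):
        if not credential.get("valid"):       # stand-in for GSI authentication
            raise PermissionError("authentication failed")
        server = self.resources[job["site"]]  # stand-in for resource selection
        return server.run(job)

class GridChemClient:
    """Tier 1: the desktop GUI, reduced here to a submit call."""
    def __init__(self, middleware):
        self.middleware = middleware

    def submit_job(self, credential, job):
        return self.middleware.submit(credential, job)

middleware = MiddlewareServer({"ncsa": ComputeServer()})
client = GridChemClient(middleware)
result = client.submit_job({"valid": True}, {"name": "g03-opt", "site": "ncsa"})
print(result["status"])  # COMPLETED
```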

GridChem Client
Graphical user interface (GUI)
– Java desktop application
– helps scientists generate input
– submits and monitors quantum chemistry jobs remotely
– visualizes output data
Leverages an internal development project at NCSA
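The client's "generate input" role can be illustrated by building a minimal Gaussian input deck from a molecule description. The route section, title, charge/multiplicity line, and Cartesian coordinates follow the standard Gaussian input layout; the function name and the sample water geometry are purely illustrative, not taken from GridChem.

```python
# Sketch of input-deck generation for Gaussian (illustrative, not GridChem code).

def gaussian_input(route, title, charge, multiplicity, atoms):
    """Build a Gaussian input deck. atoms: list of (symbol, x, y, z) in Angstroms."""
    lines = [route, "", title, "", f"{charge} {multiplicity}"]
    for sym, x, y, z in atoms:
        lines.append(f"{sym:2s} {x:12.6f} {y:12.6f} {z:12.6f}")
    lines.append("")  # Gaussian expects a trailing blank line
    return "\n".join(lines)

water = [("O", 0.0, 0.0, 0.119),
         ("H", 0.0, 0.757, -0.477),
         ("H", 0.0, -0.757, -0.477)]
deck = gaussian_input("#P B3LYP/6-31G(d) Opt", "water optimization", 0, 1, water)
print(deck.splitlines()[0])  # #P B3LYP/6-31G(d) Opt
```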

GridChem Client Architecture
Composed of several modules
– authentication
– job editor
  molecule builder (visual molecular editor)
  molecular fragment database
  crystal structure database
– job submission
– job manager
  job status information
  output monitoring and retrieval

GridChem Integration
– user registration and adaptation to community allocations
  integration of community authentication mechanisms (currently supports project allocations; straightforward to extend)
– generalization of input file formats to support additional applications
– updates for method choices and algorithms
– integration of deep analysis and three-dimensional visualization software
– integration of application-specific options

Middleware Server
Middleware interface to the computational grid
– authentication
– data management
– resource specification
– job launch and execution
– provides the client with job status information
– provides access to job data for analysis
  input, output, and job details stored in a mass storage archive
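The middleware's bookkeeping role (track job status, keep input/output/details retrievable for analysis) can be sketched as a small job archive. This is a hypothetical illustration; the class and method names are mine, and the real server would persist to a mass storage archive rather than an in-memory dict.

```python
# Hypothetical sketch of middleware job bookkeeping (illustrative names only).
import uuid

class JobArchive:
    def __init__(self):
        self._jobs = {}

    def launch(self, resource, input_deck):
        """Record a newly launched job and return its id."""
        job_id = str(uuid.uuid4())
        self._jobs[job_id] = {
            "resource": resource,
            "input": input_deck,
            "status": "QUEUED",
            "output": None,
        }
        return job_id

    def update(self, job_id, status, output=None):
        self._jobs[job_id]["status"] = status
        if output is not None:
            self._jobs[job_id]["output"] = output

    def status(self, job_id):
        """The client's 'job status information'."""
        return self._jobs[job_id]["status"]

    def retrieve(self, job_id):
        """The client's 'access to job data for analysis'."""
        return self._jobs[job_id]

archive = JobArchive()
jid = archive.launch("ncsa-cluster", "#P HF/6-31G(d) Opt")
archive.update(jid, "COMPLETED", output="SCF Done")
print(archive.status(jid))  # COMPLETED
```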

Lots of Middleware Leverage
GRIDS Center software distribution
– Condor, the Globus Toolkit, GRAM, GSI, MDS, NWS, MyProxy, GridConfig Tools, GridFTP, UberFTP, …
Condor-G
– acts as a general interface, leveraging Condor as a general (meta-)scheduler
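A Condor-G submission to a Globus-managed resource looks roughly like the submit description below. This is a generic sketch of the standard Condor-G grid-universe syntax, not a file from this project; the gatekeeper hostname, script, and file names are placeholders.

```text
# Hypothetical Condor-G submit description (grid universe via Globus GRAM)
universe      = grid
grid_resource = gt2 gatekeeper.example.edu/jobmanager-pbs
executable    = run_gamess.sh
arguments     = benzene.inp
output        = benzene.out
error         = benzene.err
log           = benzene.log
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
queue
```

Condor-G then handles credential delegation, resubmission on failure, and status tracking, which is what lets it serve as the (meta-)scheduler across the CCG sites.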

Middleware Deployment/Integration
Establish the middleware server
– supports the interface from the GridChem client to grid computational resources
– to be located at NCSA
Deploy middleware software and services to computational resources
– base software installation and configuration
– incorporate advanced technologies
  resource brokerage, GridPort
  consider GAT (Grid Applications Toolkit)

Computational Chemistry Applications Integration
GridChem already supports some applications
– Gaussian 98/03, GAMESS, MolPro
Schedule for integration of additional software
– NWChem
– ACES-2
– Crystal
– Q-Chem
– NBO
– WIEN2k
– MCCCS Towhee
– homegrown computational chemistry codes developed at LSU

Training and Outreach
Increase awareness through outreach
– press releases and presentations
Educate the community through training
– Access Grid-based live training
– workshops
– on-line courses
– use of modules in undergraduate and graduate courses

Training and Outreach Integration
Develop modules on a set of topics
– interface fundamentals (e.g., inputs, choice lists, controls)
– authentication/authorization
– molecular builder
– job manager
– resource management
– post-processing
– visualization
– integration of additional applications
Provide as workshops and seminars
– 5th Annual Computational Chemistry Conference at the Univ of Kentucky, Fall 2005
Annual updates
– track advancements and additional technologies developed

User/Community Support
Support provided by a distributed set of staff involved in the project
Problem tracking through a single mechanism
– Bugzilla to be set up for tracking and resolution
Online documentation to be provided on the CCG website

Leveraged Technologies
GridChem
– internally funded project at NCSA
NMI GRIDS Center
– lots of components there
ChemViz
– spectral analysis module and visualization interfaces to Molden
– add-in analysis and visualization component for the GridChem client

Leveraged Resources
Over 400 processors and 3,525,000 CPU hours available annually

System (Site)               Procs Avail    Total CPU Hours/Year
HP Intel Cluster (OSC)           12               100,000
Intel Cluster (OSC)              36               315,000
Intel Cluster (UKy)              96               840,000
HP Integrity Superdome           33               290,000
Intel Cluster (NCSA)             64               560,000
SGI Origin2000 (NCSA)           128             1,000,000
Intel Cluster (LSU)              32               280,000
IBM Power4 (TACC)                16               140,000