Web Services for Production Cyberenvironment for a Computational Chemistry Grid. University of Hyderabad, India, 17 March 2007. Sudhakar Pamidighantam, NCSA, University of Illinois at Urbana-Champaign

Acknowledgements

Outline
- Historical Background
- Grid Computational Chemistry Production Environments
- Current Status
- Web Services
- Usage
- Brief Demo
- Future

Motivation
- Software: reasonably mature and easy to use to address chemists' questions of interest
- Community of users: need, and are capable of using, the software; some are non-traditional computational chemists
- Resources: varied in capacity and capability

Background
- Quantum Chemistry Remote Job Monitor (Quantum Chemistry Workbench), 1998, NCSA
- ChemViz, NSF (USA)
- Technologies: web-based client-server models, visual interfaces, distributed computing (Condor)

GridChem
- The NCSA Alliance was commissioned in 1998
- Diverse HPC systems deployed both at NCSA and at Alliance partner sites
- Batch schedulers differed between sites
- Policies favored different classes and modes of use at different sites/HPC systems

Extended TeraGrid Facility

NSF Petascale Road Map
- Track 1: a multi-petaflop single-site system to be deployed by 2010; several consortia competing (now under review)
- Track 2: sub-petaflop systems, several to be deployed until Track 1 is online
- The first Track 2 system will be at TACC (450 TFlops), available Fall 2007 ( Processors/Cores)
- NCSA is deploying a 110 TFlops system in April 2007 (10000 processors/cores)
- A second round of sub-petaflop systems is being reviewed

Grid and Gridlock
- The Alliance led to a physical Grid; the Grid led to the TeraGrid
- A homogeneous grid with a predefined, fixed software and system stack was planned (TeraGrid), but it proved difficult to keep it homogeneous
- Local preferences and diversity lead to heterogeneous grids now (operating systems, schedulers, policies, software and services)
- Openness and standards that lead to interoperability are critical for successful services

Current Grid Status: Grid Hardware, Middleware, Scientific Applications

User Community: Chemistry and Computational Biology user base, Sep 03 – Oct 04, by allocation category (NRAC, AAB, Small Allocations), with #PIs and #SUs per category (SU figures on the slide include 5,953,100; 1,374,…; …,000)

Some User Issues Addressed by the New Services
- New systems meant learning new commands and porting codes
- Learning new job submission and monitoring protocols
- New proposals for time (time for new proposals)
- Computational modeling became more popular and the number of users increased (user management)
- Batch queues got longer / waiting increased
- Finding resources on which to compute became complicated, often across multiple distributed sites
- Multiple proposals/allocations/logins
- Authentication and data security
- Data management

Computational Chemistry Grid
- A Virtual Organization: integrated cyberinfrastructure for computational chemistry
- Integrates applications, middleware, HPC resources, scheduling and data management, allocations, user services and training

Resources
System (Site): Procs Avail / Total CPU Hours per Year / Status
- Intel Cluster (OSC): 36 / 315,000 / SMP and cluster nodes
- HP Integrity Superdome (UKy): 33 / 290,000 / to be replaced with SMP/cluster nodes
- IA32 Linux Cluster (NCSA): 64 / 560,000 / allocated
- Intel Cluster (LSU): 1024 / 1,000,000 / allocated
- IBM Power4 (TACC): 16 / 140,000 / allocated
- TeraGrid (multiple institutions): … / …,000 / new allocations expected
The initial Access Grid Testbed nodes (38) and Condor SGI resources (NCSA, 512 nodes) have been retired this year.

Other Resources
- Extant HPC resources at various supercomputer centers (interoperable)
- Optionally, other grids and hubs / local / personal resources
- These may require existing allocations/authorization

Grid Middleware (architecture diagram): user, client, portal, GridChem system, proxy server, grid services, grid applications, mass storage

Applications
- GridChem supports some applications already: Gaussian, GAMESS, NWChem, Molpro, QMCPack, Amber
- Schedule of integration of additional software: ACES-3, Crystal, Q-Chem, WIEN2k, MCCCS Towhee, others...

GridChem Middleware Service (GMS)

GridChem Web Services Quick Primer
- XML is used to tag the data, SOAP is used to transfer the data, WSDL is used to describe the services available, and UDDI (Universal Description, Discovery, and Integration) is used to list what services are available.
- Web services differ from web page systems or web servers: there is no GUI.
- Web services share business logic, data and processes through APIs with each other (not with the user).
- Web services describe a standard way of interacting with "web based" applications.
- A client program connecting to a web service can read the WSDL to determine what functions are available on the server. Any special datatypes used are embedded in the WSDL file in the form of XML Schema.
- WSRF standards compliant.
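
To make the primer concrete, here is a minimal sketch of how a client can consume a SOAP service described by a WSDL, using the standard JAX-WS Dispatch API at the raw message level. The WSDL URL, namespace, service/port names and the getStatus operation are hypothetical placeholders, not the actual GMS endpoints.

import java.net.URL;
import javax.xml.namespace.QName;
import javax.xml.soap.MessageFactory;
import javax.xml.soap.SOAPMessage;
import javax.xml.ws.Dispatch;
import javax.xml.ws.Service;

public class SoapClientSketch {
    public static void main(String[] args) throws Exception {
        // The client reads the WSDL to learn which operations the server exposes.
        URL wsdl = new URL("https://example.org/gms/services/GMSService?wsdl"); // hypothetical
        QName serviceName = new QName("http://example.org/gms", "GMSService");   // hypothetical
        QName portName    = new QName("http://example.org/gms", "GMSPort");      // hypothetical

        Service service = Service.create(wsdl, serviceName);

        // Dispatch works at the raw SOAP message level; data types are described
        // by the XML Schema embedded in the WSDL.
        Dispatch<SOAPMessage> dispatch =
                service.createDispatch(portName, SOAPMessage.class, Service.Mode.MESSAGE);

        SOAPMessage request = MessageFactory.newInstance().createMessage();
        request.getSOAPBody().addChildElement("getStatus", "gms", "http://example.org/gms"); // hypothetical operation

        SOAPMessage response = dispatch.invoke(request); // SOAP carries the XML payload over the wire
        response.writeTo(System.out);
    }
}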

GridChem Web Services: Client ↔ Objects ↔ Database interaction
- Layers: Client, WS Resources, DTO objects, Business Model, DAO, Hibernate, Database (hb.xml mapping)
- DTO (Data Transfer Object): serialized and transferred as XML
- DAO (Data Access Object): how to get the DB objects
- hb.xml (Hibernate data map): describes object/column data mapping
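
A minimal sketch of the DTO/DAO layering pictured above, assuming a Hibernate-mapped Job class. The class and property names are illustrative assumptions rather than the actual GMS source, and the Hibernate mapping file (hb.xml) is assumed to exist.

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class JobDaoSketch {

    /** Persistent class: mapped to the jobs table by a Hibernate (hb.xml) mapping file. */
    public static class Job {
        private Long jobId;
        private String jobName;
        private String status;
        public Long getJobId() { return jobId; }
        public void setJobId(Long jobId) { this.jobId = jobId; }
        public String getJobName() { return jobName; }
        public void setJobName(String jobName) { this.jobName = jobName; }
        public String getStatus() { return status; }
        public void setStatus(String status) { this.status = status; }
    }

    /** DTO: a plain serializable object that is serialized to XML and sent to the client. */
    public static class JobDTO implements java.io.Serializable {
        public Long jobId;
        public String jobName;
        public String status;
    }

    // Reads hibernate.cfg.xml plus the object/column mapping files.
    private final SessionFactory sessionFactory =
            new Configuration().configure().buildSessionFactory();

    /** DAO: knows how to fetch the persistent object and repackage it as a DTO. */
    public JobDTO findJob(Long jobId) {
        Session session = sessionFactory.openSession();
        try {
            Job job = (Job) session.get(Job.class, jobId);
            if (job == null) return null;
            JobDTO dto = new JobDTO();
            dto.jobId = job.getJobId();
            dto.jobName = job.getJobName();
            dto.status = job.getStatus();
            return dto;
        } finally {
            session.close();
        }
    }
}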

GridChem Data Models
- Core entities: Users, Projects, Resources (with UserProjectResource and UsersResources link tables)
- Resource subtypes: SoftwareResources, ComputeResources, NetworkResources, StorageResources
- Resources: resourceID, type, hostName, IPAddress, siteID
- UserProjectResource: userID, projectID, resourceID, loginName, SUsLocalUserUsed
- Jobs: jobID, jobName, userID, projID, softID, cost
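
As an illustration of the resource side of this data model, the following single-file Java sketch mirrors the resource columns and subtypes listed on the slide; the field types and everything beyond the listed names are assumptions.

// Resource.java: base class carrying the columns named on the slide.
public abstract class Resource {
    protected long resourceId;   // resourceID
    protected String type;
    protected String hostName;
    protected String ipAddress;  // IPAddress
    protected long siteId;       // siteID
}

// The four concrete resource kinds from the data model.
class ComputeResource extends Resource { }
class NetworkResource extends Resource { }
class StorageResource extends Resource { }
class SoftwareResource extends Resource { }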

Computational Chemistry Resource

GMS_WS Use Cases
- Authentication
- Job Submission
- Resource Monitoring
- Job Monitoring
- File Retrieval
- …

GMS_WS Authentication
WSDL (Web Service Definition Language) is a language for describing how to interface with XML-based services. It describes network services as a set of endpoints operating on messages carrying either document-oriented or procedure-oriented information. The service interface is called the port type.
WSDL file (fragment): <definitions name="MathService" targetNamespace=" xmlns=" …
Authentication exchange (GridChem client and GMS):
- The client contacts GMS; GMS creates a session, a session RP and an EPR, and sends the EPR back (like a cookie, but more than that).
- The client sends a login request (username:passwd); GMS validates it, loads UserProjects, and sends an acknowledgement.
- The client retrieves UserProjects through the GetResourceProperty port type (PT).
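
The exchange above could look roughly like this from the client side. GmsPort and its methods are placeholders standing in for the WSRF stubs generated from the GMS WSDL; they are not the real GridChem API.

public class AuthSketch {

    interface GmsPort {                                        // stands in for the generated port type
        String createSession();                                // GMS creates a session resource and returns its EPR
        boolean login(String epr, String user, char[] password); // GMS validates and loads the UserProjects RP
        String getResourceProperty(String epr, String rpName); // GetResourceProperty PT
    }

    static String authenticate(GmsPort gms, String user, char[] password) {
        String epr = gms.createSession();                      // the EPR acts like a session cookie
        if (!gms.login(epr, user, password)) {
            throw new IllegalStateException("login rejected");
        }
        // With the session EPR the client can now read resource properties,
        // e.g. the list of projects the user may charge jobs to.
        return gms.getResourceProperty(epr, "UserProjects");
    }
}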

GMS_WS Authentication (continued)
- The client selects a project and calls the LoadVO port type (with MAC address).
- GMS verifies user/project/MAC address, loads the UserResources RP, validates, loads UserProjects, and sends an acknowledgement.
- The client retrieves UserResources [as userVO/Profile] through the GetResourceProperty port type (PT).

GMS_WS Job Submission (GC client and GMS)
- The client creates a Job object and calls the PredictJobStartTime PT with a JobDTO; GMS answers with the JobStart Prediction RP.
- The client needs to check that allocation time is available.
- If the decision is OK, the client calls the SubmitJob PT with the JobDTO.
- GMS creates a Job object, submits through an API (CoGKit, GAT or "gsi-ssh"), stores the Job object, and sends an acknowledgement.
- Completion status flows from the batch system to the GMS server and into the DB.
(PT = portType, RP = Resource Properties, DTO = Data Transfer Object)
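
A hypothetical sketch of the client-side submission decision described above: ask GMS for a predicted start time, check that allocation time remains, then submit the JobDTO. The port-type interface and DTO fields are placeholders, not the real GMS stubs.

public class SubmitSketch {

    static class JobDTO {
        String jobName;
        String application;      // e.g. Gaussian, GAMESS, ...
        int requestedCpuHours;
    }

    interface GmsJobPort {                                 // stands in for the Predict/Submit port types
        long predictJobStartTime(JobDTO job);              // PredictJobStartTime PT, seconds from now
        double allocationBalance(String project);          // remaining SUs for the project
        String submitJob(JobDTO job, String project);      // SubmitJob PT, returns a job id
    }

    static String submitIfReasonable(GmsJobPort gms, JobDTO job, String project) {
        long waitSeconds = gms.predictJobStartTime(job);
        double balance = gms.allocationBalance(project);
        // Client-side decision: only submit when the allocation can cover the request
        // and the predicted queue wait is acceptable.
        if (balance < job.requestedCpuHours || waitSeconds > 7 * 24 * 3600) {
            return null;                                    // pick another resource instead
        }
        return gms.submitJob(job, project);                 // GMS stores the job and submits via CoG/GAT/gsi-ssh
    }
}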

GMS_WS Monitoring (GC client, GMS, Resources/Kits/DB)
- The client requests job and resource status and allocation balance, discovers applications (software resources), and monitors systems and queues.
- GMS answers from the UserResource RP, which is updated from the DB, and sends the info; the client parses the XML and displays it.
- On the resource side, job launcher notifications are parsed by the VO admin servers into the DB (status + cost).
(PT = portType, RP = Resource Properties, DTO = Data Transfer Object, DB = database server)
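
A hypothetical polling sketch of this monitoring flow: the client periodically asks GMS for job status and allocation balance, which GMS serves from its database. Interface and method names are placeholders, not the real GMS port types.

import java.util.List;

public class MonitorSketch {

    interface GmsMonitorPort {
        String jobStatus(String jobId);                  // parsed from launcher notifications into the DB
        List<String> discoverApplications(String host);  // software resources available on a machine
        String queueStatus(String host);                 // batch queue summary
        double allocationBalance(String project);
    }

    static void watch(GmsMonitorPort gms, String jobId, String project) throws InterruptedException {
        while (true) {
            String status = gms.jobStatus(jobId);
            System.out.printf("job %s: %s, balance %.1f SUs%n",
                    jobId, status, gms.allocationBalance(project));
            if ("COMPLETE".equals(status) || "FAILED".equals(status)) break;
            Thread.sleep(60_000);                        // poll once a minute
        }
    }
}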

GMS_WS Job Status (GC client, GMS, Resources/Kits/DB)
- Job status is carried in jobDTO.status.
- The job launcher sends status updates; schedulers/notifications provide estimated start times.
- Notifications go to the client, e-mail and IM.

GMS_WS File Retrieval (MSS) (GC client, GMS, Resources/Kits/DB)
- On job completion, output is sent to the MSS.
- The client calls the LoadFile PT (project folder + job); GMS validates that the project folder is owned by the user, queries the MSS (root directory listing with CoGKit, GAT or "gsi-ssh"), creates a FileDTO, loads it into the UserData RP (UserFiles RP + FileDTO object), and sends the new listing; the client reads it through the GetResourceProperty PT.
- The client then requests a file through the RetrieveFiles PT (+ file relative path); GMS retrieves the file (CoGKit, GAT or "gsi-ssh") and the client stores it locally.
(PT = portType, RP = Resource Properties, DTO = Data Transfer Object, MSS = Mass Storage System)
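
A hypothetical sketch of the two-step retrieval above: first list the job's folder on the mass storage system, then fetch one file by its relative path. The port and DTO names are placeholders for the LoadFile/RetrieveFiles port types, not the real stubs.

import java.util.List;

public class FileRetrievalSketch {

    static class FileDTO {
        String relativePath;
        long sizeBytes;
    }

    interface GmsFilePort {
        // LoadFile PT: GMS checks folder ownership, then queries the MSS listing.
        List<FileDTO> loadFileListing(String project, String jobId);
        // RetrieveFiles PT: GMS pulls the file from the MSS via CoG/GAT/gsiftp.
        byte[] retrieveFile(String project, String jobId, String relativePath);
    }

    static void fetchFirstOutput(GmsFilePort gms, String project, String jobId) throws java.io.IOException {
        List<FileDTO> listing = gms.loadFileListing(project, jobId);
        if (listing.isEmpty()) return;
        FileDTO first = listing.get(0);
        byte[] data = gms.retrieveFile(project, jobId, first.relativePath);
        // Store the retrieved output locally under the same relative name.
        java.nio.file.Files.write(java.nio.file.Paths.get(first.relativePath), data);
    }
}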

GMS_WS File Retrieval (continued)
- The client calls the RetrieveJobOutput PT (+ JobDTO) and reads through the GetResourceProperty PT.
- GMS looks up the job record in the DB: if the job is running, the file comes from the resource; if complete, from the MSS (retrieved with CoGKit, GAT or "gsiftp").
- GMS creates a FileDTO and loads it into the UserData RP.
(PT = portType, RP = Resource Properties, DTO = Data Transfer Object, MSS = Mass Storage System)

GridChem Web Services: WSRF (Web Services Resource Framework) compliant
WSRF specifications: WS-ResourceProperties (WSRF-RP), WS-ResourceLifetime (WSRF-RL), WS-ServiceGroup (WSRF-SG), WS-BaseFaults (WSRF-BF)
The running service container (%ps -aux | grep ws):
/usr/java/jdk1.5.0_05/bin/java \
  -Dlog4j.configuration=container-log4j.properties \
  -DGLOBUS_LOCATION=/usr/local/globus \
  -Djava.endorsed.dirs=/usr/local/globus/endorsed \
  -DGLOBUS_HOSTNAME=derrick.tacc.utexas.edu \
  -DGLOBUS_TCP_PORT_RANGE=62500,64500 \
  -Djava.security.egd=/dev/urandom \
  -classpath /usr/local/globus/lib/bootstrap.jar:/usr/local/globus/lib/cog-url.jar:/usr/local/globus/lib/axis-url.jar \
  org.globus.bootstrap.Bootstrap org.globus.wsrf.container.ServiceContainer -nosec
Annotations from the slide: logging configuration; where to find Globus; where to get the random seed for encryption key generation; classpath (required jars).

GridChem Software Organization Open Source Distribution CVS for GridChem

Package: org.gridchem.service.gms GMS_WS

+ Should these each be a separate package?

GMS_WS package organization (two-column slide: package names and the roles described for them)
Packages: model, dto, credential, job, notification, file, file.task, job.task, user, exceptions, resource, persistence, synch, query, test, util, dao, gpir, crypt, enumerators, gat, proxy, GMS_WS, client, audit, gms
Roles described on the slide:
- Classes for the WSRF service implementation (PT)
- Command-line tests to mimic client requests
- Data Access Objects: query the DB via persistent classes (Hibernate)
- Data Transfer Objects (Job, File, Hardware, Software, User), as XML
- How to handle errors (exceptions)
- CCG service business model (how to interact)
- Contains the user's credentials for job submission, file browsing, …
- "Oversees correct" handling of user data (get/put file)
- Defines Job, utilities and enumerations (SubmitTask, KillTask, …)
- CCGResource and utilities, synched by GPIR; abstract classes NetworkRes., ComputeRes., SoftwareRes., StorageRes., VisualizationRes.
- User (has attributes Preference/Address)
- DB operations (CRUD), OR maps, pool management, DB sessions
- Classes that communicate with other web services
- Periodically update the DB with GPIR info (GPIR calls)
- JUnit service tests (gms.properties): authentication, VO retrieval, resource query/synch, job management, file management, notification
- Contains utility and singleton classes for the service
- Encryption of the login password
- Mapping from GMS_WS enumeration classes to the DB
- GAT utility classes: GATContext and GAT Preferences generation
- Classes that deal with CoGKit configuration
- Autonomous notification via e-mail, IM, text message

GMS_WS external jars
- Testing
- For XML parsing: a "Java" Document Object Model
  - Lightweight
  - Reading/writing XML docs
  - Complements SAX (parser) and DOM
  - Uses Collections
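
Assuming the XML library described here is JDOM, a minimal read/inspect/write round trip looks like the following (JDOM 1.x API; the input file name is a placeholder).

import java.io.File;
import org.jdom.Document;
import org.jdom.Element;
import org.jdom.input.SAXBuilder;
import org.jdom.output.Format;
import org.jdom.output.XMLOutputter;

public class JdomSketch {
    public static void main(String[] args) throws Exception {
        // JDOM builds its lightweight document tree on top of a SAX parser.
        Document doc = new SAXBuilder().build(new File("job.xml")); // placeholder file
        Element root = doc.getRootElement();

        // The tree uses java.util Collections, so children come back as a List.
        for (Object child : root.getChildren()) {
            System.out.println(((Element) child).getName());
        }

        root.setAttribute("checked", "true");
        new XMLOutputter(Format.getPrettyFormat()).output(doc, System.out);
    }
}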

Authentication

Resource Status

Job Editor

Job Submission

Job Monitoring

Gradient Monitoring

Energy Monitoring

Post Processing

Visualization Molecular Visualization Electronic Properties Spectra Vibrational Modes

Molecular Visualization
- Better molecule representations (ball-and-stick/VDW/MS) in the Nanocad molecular editor
- Third-party visualizer integration: Chime/VMD
- Export possibilities to other interfaces
- Deliver standard file formats (XML, SDF, MSF, SMILES, etc.)

Eigen Function Visualization Molecular Orbital/Fragment Orbital MO Density Visualization MO Density Properties Other functions Radial distribution functions

Some example Visuals Arginine Gamess/6-31G* Total electronic density 2D - Slices

Electron Density in 3D Interactive (VRML)

Orbital 2D Displays N2 6-31g* Gamess

Orbital 3D VRML

Spectra
- IR/Raman vibrational spectra
- UV-visible spectra
- Spectra to normal modes
- Spectra to orbitals

GridChem Use
- Allocation
- Community and external registration
- Consulting/user services: ticket tracking, allocation management
- Documentation
- Training and outreach: FAQ extraction, tutorials, dissemination

Users and Usage
- 170 users, including academic PIs, two graduate classes, and about 15 training users
- NCSA: SUs, plus a 7-node dedicated system
- UKy: around SUs
- OSC: 13,820 SUs, plus a 14-node dedicated system
- Usage at LSU and TACC as well
- More than CPU wall-hours since Jan 06

Science Enabled
- Chemical Reactivity of the Biradicaloid (HO...ONO) Singlet States of Peroxynitrous Acid. The Oxidation of Hydrocarbons, Sulfides, and Selenides. Bach, R. D.; Dmitrenko, O.; Estévez, C. M. J. Am. Chem. Soc. 2005, 127.
- The "Somersault" Mechanism for the P-450 Hydroxylation of Hydrocarbons. The Intervention of Transient Inverted Metastable Hydroperoxides. Bach, R. D.; Dmitrenko, O. J. Am. Chem. Soc. 2006, 128(5).
- The Effect of Carbonyl Substitution on the Strain Energy of Small Ring Compounds and their Six-member Ring Reference Compounds. Bach, R. D.; Dmitrenko, O. J. Am. Chem. Soc. 2006, 128(14), 4598.

Science Enabled
- Azide Reactions for Controlling Clean Silicon Surface Chemistry: Benzylazide on Si(100)-2×1. Semyon Bocharov, Olga Dmitrenko, Lucila P. Mendez De Leo, and Andrew V. Teplyakov*. Department of Chemistry and Biochemistry, University of Delaware, Newark, Delaware. Received April 13, 2006. bin/asap.cgi/jacsat/asap/pdf/ja pdf [May require ACS access]

Possible H-bond network for the P450cam hydroperoxy intermediate. Suggested: THR252 accepts an H-bond from the hydroperoxy group (Fe(III)-OOH), which promotes the second protonation on the distal oxygen, leading to O-O bond cleavage. Nagano, S.; Poulos, T. L. J. Biol. Chem. 2005, 250, p. 1668. Auclair, K.; Hu, Z.; Little, D. M.; Ortiz de Montellano, P. R.; Groves, J. T. J. Am. Chem. Soc. 2002, 124, 6020.

The Somersault Isomerization of Model Cpd 0. Robert Bach and Olga Dmitrenko, 2006

Energy Diagram for the Concerted Non-synchronous Hydroxylation of Isobutane. Energy diagram (kcal/mol) for the oxidation of isobutane with the ground state, 24a (GS-8 hydrogen bonded to isobutane). MIN-24b [model oxidant MIN-10 (PorFe(SH)O HO) hydrogen bonded to isobutane] is not necessarily on the reaction pathway.

Somersault Mechanism Summary for Isobutane Hydroxylation

Unsymmetrical Mo(CO)4 Crown Ethers

Dibenzaphosphepin-based bis(phosphorus)polyether chelated Mo(CO)4

Crystal Structures
- CSD: XAPZAP, cis-(6,6'-((1,1'-Binaphthyl)-2,2'-diylbis(oxy))bis(dibenzo(d,f)(1,3,2)dioxaphosphepin))-tetracarbonyl-molybdenum(0), C48 H28 Mo1 O10 P2
- CSD: DEQDOS, cis-Tetracarbonyl-(P,P'-(6-(2'-oxy-2-biphenyl)-3,6-dioxa-hexanolato)-bis(dibenzo(d,f)(1,3,2)dioxaphosphepine)-P,P')-molybdenum, C44 H32 Mo1 O12 P2

Reference Structure for Comparison

Starting Structure

Optimized Structure

Reference Structure for Comparison 8 7

Structural Comparisons: C-C torsion angles for the OCH2CH2O fragments and for the axially chiral biaryl groups, compared across PCMODEL*, UFF, ab initio and Amber for the fragments C37-C42-C43-C, C1-C6-C7-C, C13-C22-C23-C, C32-O-C33-C, O-C33-C34-O, C33-C34-O-C, C34-O-C35-C and O-C35-C (table of values shown on the slide). *Hariharasarma, et al. Organomet., ,. Ab initio = B3LYP/3-21G*. Amber9 ff03, GAFF, chloroform, 300 K, median over 1 ns MD.

MD OCH2CH2O Structure 8 7

MD Biaryl Structure

1H NMR Chemical Shift Comparison for Aromatic Protons. Reference: 32 ppm (from TMS, B3LYP/6-31G*). Columns: Atom, Exp., Ab initio (table of experimental vs. computed shifts for the aromatic H atoms shown on the slide).

Third Year Plans
- Post-processing: spectra and related entities
- New application support: ACES3, DMol3, VASP
- Expansion of resources: TeraGrid, OSG, PRAGMA systems and new resources at partner sites
- Extension plan: two proposals in review for extension

Future Plans
- Preparations for petaflop computing: high-throughput massively parallel applications
- Complex workflows for integrating multiple interdependent applications
- Multiscale computing
- Archiving and annotating data for future use
- Open data initiatives by NIH and NSF

Acknowledgments
- Rion Dooley, TACC: middleware infrastructure
- Stelios Kyriacou, OSC: middleware scripts
- Chona Guiang, TACC: databases and applications
- Kent Milfeld, TACC: database integration
- Kailash Kotwani, NCSA: applications and middleware
- Scott Brozell, OSC: applications and testing
- Michael Sheetz, UKy: application interfaces
- Vikram Gazula, UKy: server administration
- Tom Roney, NCSA: server and database maintenance