National Center for Supercomputing Applications Production Cyberenvironment for a Computational Chemistry Grid PRAGMA13, NCSA 26 Sep 07 Sudhakar Pamidighantam.


1 National Center for Supercomputing Applications Production Cyberenvironment for a Computational Chemistry Grid PRAGMA13, NCSA 26 Sep 07 Sudhakar Pamidighantam NCSA, University of Illinois at Urbana-Champaign sudhakar@ncsa.edu

2 National Center for Supercomputing Applications Acknowledgements

3 National Center for Supercomputing Applications Outline Historical Background Grid Computational Chemistry Production Environments Current Status Web Services Usage (Grid and Science Achievements) Brief Demo Future

4 National Center for Supercomputing Applications Motivation Software - reasonably mature and easy to use to address chemists' questions of interest Community of users - need and are capable of using the software; some are non-traditional computational chemists Resources - varied in capacity and capability

5 National Center for Supercomputing Applications Background Quantum Chemistry Remote Job Monitor (Quantum Chemistry Workbench) 1998, NCSA Chemviz 1999-2001, NSF (USA) http://chemviz.ncsa.uiuc.edu Technologies Web Based Client Server Models Visual Interfaces Distributed computing (Condor)

6 National Center for Supercomputing Applications GridChem NCSA Alliance was commissioned 1998 Diverse HPC systems deployed both at NCSA and Alliance Partner Sites Batch schedulers different at sites Policies favored different classes and modes of use at different sites/HPC systems

7 National Center for Supercomputing Applications Extended TeraGrid Facility www.teragrid.org

8 National Center for Supercomputing Applications NSF Petascale Road Map Track 1 Scheme Multi-petaflop single-site system to be deployed by 2010 Several consortia competing (now under review) Track 2 Sub-petaflop systems Several to be deployed until Track 1 is online First one will be at TACC (450 TFlops), available Fall 2007 (50,000 processors/cores) NCSA is deploying a 110 TFlops system in April 2007 (10,000 processors/cores) Second sub-petaflop system is being reviewed

9 National Center for Supercomputing Applications Grid and Gridlock The Alliance led to a physical Grid; the Grid led to TeraGrid A homogeneous grid with a predefined, fixed software and system stack was planned (TeraGrid), but it was difficult to keep it homogeneous Local preferences and diversity lead to heterogeneous grids now! (Operating systems, schedulers, policies, software and services) Openness and standards that lead to interoperability are critical for successful services

10 National Center for Supercomputing Applications Current Grid Status Grid Hardware Middleware Scientific Applications

11 National Center for Supercomputing Applications User Community Chemistry and Computational Biology User Base, Sep 03 – Oct 04

            NRAC         AAB     Small Allocations
#PIs          26          23           64
#SUs    5,953,100   1,374,100      640,000

12 National Center for Supercomputing Applications

13 Some User Issues Addressed by the New Services New systems meant learning new commands Porting codes Learning new job submission and monitoring protocols New proposals for time (time for new proposals) Computational modeling became more popular and the number of users increased (user management) Batch queues are longer / waiting increased Finding resources on which to compute is complicated - probably multiple distributed sites Multiple proposals/allocations/logins Authentication and data security Data management

14 National Center for Supercomputing Applications Computational Chemistry Grid This is a Virtual Organization Integrated Cyber Infrastructure for Computational Chemistry Integrates Applications, Middleware, HPC resources, Scheduling and Data management Allocations, User services and Training

15 National Center for Supercomputing Applications Resources

System (Site)                      Procs Avail   Total CPU Hours/Year   Status
Intel Cluster (OSC)                        36            315,000        SMP and Cluster nodes
HP Integrity Superdome (UKy)               33            290,000        To be replaced with SMP/Cluster nodes
IA32 Linux Cluster (NCSA)                  64            560,000        Allocated
Intel Cluster (LSU)                      1024          1,000,000        Allocated
IBM Power4 (TACC)                          16            140,000        Allocated
TeraGrid (Multiple Institutions)      2-10000            250,000        New Allocations Expected

The initial Access Grid testbed nodes (38) and the Condor SGI resources (NCSA, 512 nodes) have been retired this year.

16 National Center for Supercomputing Applications Other Resources Extant HPC resources at various Supercomputer Centers (Interoperable) Optionally Other Grids and Hubs/local/personal resources These may require existing allocations/Authorization

17 National Center for Supercomputing Applications

18 Grid Middleware Proxy Server, GridChem System, user, Portal, Client, Grid Services, Grid application, Mass Storage http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0438312

19 National Center for Supercomputing Applications Applications GridChem supports some applications already –Gaussian, GAMESS, NWChem, Molpro, ADF, QMCPack, Amber Schedule of integration of additional software –ACES-3 –Crystal –Q-Chem –Wien2K –MCCCS Towhee –Others...

20 National Center for Supercomputing Applications GridChem User Services Allocation Request https://www.gridchem.org/allocations/comm_form.php

21 National Center for Supercomputing Applications GridChem User Services Consulting Ticketing System User View

22 National Center for Supercomputing Applications GridChem User Services Consulting Ticketing System https://www.gridchem.org/consult/ Consultants View

23 National Center for Supercomputing Applications Gridchem Middleware Service (GMS)

24 National Center for Supercomputing Applications GridChem Web Services Quick Primer XML is used to tag the data, SOAP is used to transfer the data, WSDL is used for describing the services available, and UDDI (Universal Description, Discovery, and Integration) is used for listing what services are available. Web services differ from web page systems or web servers: there is no GUI. Web services share business logic, data, and processes through APIs with each other (not with the user). Web services describe a standard way of interacting with web-based applications. A client program connecting to a web service can read the WSDL to determine what functions are available on the server. Any special datatypes used are embedded in the WSDL file in the form of XML Schema. WSRF standards compliant.
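As a concrete illustration of the WSDL role described above, the sketch below parses a minimal, hypothetical WSDL fragment (not the real GMS WSDL; the operation names are borrowed from the operations slide) and lists the operations a client could discover before calling the service:

```python
import xml.etree.ElementTree as ET

# Hypothetical, minimal WSDL fragment for illustration only.
WSDL = """
<definitions xmlns="http://schemas.xmlsoap.org/wsdl/" name="GMS">
  <portType name="GMSPortType">
    <operation name="Login"/>
    <operation name="SubmitJob"/>
    <operation name="GetJobStatus"/>
  </portType>
</definitions>
"""

def list_operations(wsdl_text):
    """Return the operation names declared in a WSDL portType."""
    ns = {"wsdl": "http://schemas.xmlsoap.org/wsdl/"}
    root = ET.fromstring(wsdl_text)
    return [op.get("name") for op in root.findall(".//wsdl:operation", ns)]

print(list_operations(WSDL))  # ['Login', 'SubmitJob', 'GetJobStatus']
```

This is exactly what a generic client does at startup: read the contract, then build calls against the advertised operations rather than a hard-coded API.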

25 National Center for Supercomputing Applications GridChem Web Services Client Objects Database Interaction: Client ↔ WS Resources ↔ DTO Objects ↔ Hibernate ↔ Database (hb.xml). DTO (Data Transfer Object): serialized for transfer through XML. DAO (Data Access Object): how to get the DB objects. hb.xml (Hibernate data map): describes object/column data mapping. Business Model, DAO.
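The DTO/DAO split above can be sketched in miniature. The field names and the in-memory "database" below are invented for illustration; the real service uses Hibernate-mapped Java classes and hb.xml for the object/column mapping:

```python
from dataclasses import dataclass, asdict
import xml.etree.ElementTree as ET

# DTO: a plain data holder that can be serialized for transfer.
# (Field names here are hypothetical, not the real GMS schema.)
@dataclass
class JobDTO:
    jobID: int
    jobName: str
    status: str

def to_xml(dto):
    """Serialize a DTO to XML, mimicking the SOAP transfer step."""
    elem = ET.Element(type(dto).__name__)
    for key, value in asdict(dto).items():
        ET.SubElement(elem, key).text = str(value)
    return ET.tostring(elem, encoding="unicode")

class JobDAO:
    """DAO: hides how persistent objects are fetched. An in-memory dict
    stands in for the Hibernate-mapped database."""
    def __init__(self):
        self._table = {}
    def save(self, dto):
        self._table[dto.jobID] = dto
    def find(self, job_id):
        return self._table.get(job_id)

dao = JobDAO()
dao.save(JobDTO(42, "benzene-opt", "QUEUED"))
print(to_xml(dao.find(42)))
```

The point of the pattern: the client only ever sees serialized DTOs, while the DAO keeps all knowledge of the storage layer on the server side.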

26 National Center for Supercomputing Applications GridChem Data Models Users, Projects, Resources; UserProjectResource. Resource subtypes: SoftwareResources, ComputeResources, NetworkResources, StorageResources. Resources: resourceID, type, hostName, IPAddress, siteID. UserProjectResource: userID, projectID, resourceID, loginName, SUsLocalUserUsed. Jobs: jobID, jobName, userID, projID, softID, cost. UsersResources.
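The relational model above can be rendered as a toy schema. Column names follow the slide where they are legible; the types and sample rows are invented for illustration:

```python
import sqlite3

# Toy version of the GridChem data model: users run jobs against projects
# on resources, and each job records its cost. Sample data is invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Users(userID INTEGER PRIMARY KEY, loginName TEXT);
CREATE TABLE Projects(projectID INTEGER PRIMARY KEY);
CREATE TABLE Resources(resourceID INTEGER PRIMARY KEY, type TEXT,
                       hostName TEXT, IPAddress TEXT, siteID TEXT);
CREATE TABLE Jobs(jobID INTEGER PRIMARY KEY, jobName TEXT,
                  userID INTEGER, projID INTEGER, softID INTEGER, cost REAL);
""")
conn.execute("INSERT INTO Users VALUES (1, 'alice')")
conn.execute("INSERT INTO Projects VALUES (10)")
conn.execute("INSERT INTO Resources VALUES (7, 'compute', 'mercury', '10.0.0.7', 'NCSA')")
conn.execute("INSERT INTO Jobs VALUES (100, 'g03-opt', 1, 10, 3, 12.5)")
conn.execute("INSERT INTO Jobs VALUES (101, 'gamess-freq', 1, 10, 4, 7.5)")

# Accounting query: total SU cost charged against a project.
total = conn.execute("SELECT SUM(cost) FROM Jobs WHERE projID = 10").fetchone()[0]
print(total)  # 20.0
```

A query like this is what lets the service report allocation balances back to the client during monitoring.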

27 National Center for Supercomputing Applications Computational Chemistry Resource

28 National Center for Supercomputing Applications GMS_WS Use Cases Authentication Job Submission Resource Monitoring Job Monitoring File Retrieval … http://www.gridchem.org:8668/space/GMS/usecase

29 National Center for Supercomputing Applications GetResourceProperty SetTerminationTime Destroy Create Login LoadVO RetrieveFiles LoadFiles DeleteFiles LoadParentFiles RefreshFiles MakeDirectory SubmitJob SubmitMultipleJobs PredictJobStartTime KillJob HideJob UnhideJob UnhideJobs DeleteJob FindJobs GetJobStatus RetrieveJobOutput RetrieveNextDataBlock StopFileAction GetUserPreferences PutUserPreferences GridChem Web Services Operations

30 National Center for Supercomputing Applications GMS_WS Authentication WSDL (Web Services Description Language) is a language for describing how to interface with XML-based services. It describes network services as a set of endpoints operating on messages with either document-oriented or procedure-oriented information. The service interface is called the port type. WSDL file: <definitions name="GMS" targetNamespace="http://www.gridchem.org/gms" xmlns="http://schemas.xmlsoap.org/wsdl/" … http://www.gridchem.org:8668/space/GMS/usecase Sequence (GridChem Client ↔ GMS): client contacts GMS; GMS creates Session, Session RP and EPR; sends EPR (like a cookie, but more than that); Login Request (username:passwd); GMS validates, loads UserProjects; sends acknowledgement; client retrieves UserProjects (GetResourceProperty Port Type [PT]).

31 National Center for Supercomputing Applications GMS_WS Authentication (GridChem Client ↔ GMS): client selects project; LoadVO port type (with MAC address); GMS verifies user/project/MAC address; loads UserResources RP; validates, loads UserProjects; sends acknowledgement; client retrieves UserResources [as userVO/Profile] (GetResourceProperty port type PT). http://www.gridchem.org:8668/space/GMS/usecase
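The session handshake on the two authentication slides can be mimicked with a toy stub. The EPR here is just a random token standing in for the WS-Addressing endpoint reference, and all names and the password check are illustrative, not the real GMS protocol:

```python
import uuid

class GMSStub:
    """Toy service: create a session resource, hand back an EPR, and require
    that EPR (cookie-like) on every later call."""
    def __init__(self):
        self._sessions = {}

    def create_session(self):
        epr = str(uuid.uuid4())  # stands in for a WS-Addressing EPR
        self._sessions[epr] = {"authenticated": False}
        return epr

    def login(self, epr, username, password):
        if epr not in self._sessions:
            raise KeyError("unknown session EPR")
        # A real service validates against the user database and then
        # loads the user's projects into the session resource properties.
        ok = (password == "secret")
        self._sessions[epr]["authenticated"] = ok
        return ok

gms = GMSStub()
epr = gms.create_session()
print(gms.login(epr, "alice", "secret"))  # True
```

The design point is that state lives on the server in the session resource; the client only carries the opaque EPR between calls.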

32 National Center for Supercomputing Applications GMS_WS Job Submission (GC Client ↔ GMS): client creates Job object; PredictJobStartTime PT + JobDTO; JobStart Prediction RP returned. If decision OK, SubmitJob PT + JobDTO; GMS creates Job object; submits via API (CoGKit, GAT, gsi-ssh); stores Job object; sends acknowledgement. Need to check to make sure allocation time is available. Completion: email from batch system to GMS server; cron@GMS updates DB. (PT = portType, RP = Resource Properties, DTO = Data Transfer Object)
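The decision step in the submission flow ("if decision OK, SubmitJob") can be sketched as a small function. The job fields, the wait threshold, and the callables are all hypothetical stand-ins for the PredictJobStartTime and SubmitJob port types:

```python
def submit_if_reasonable(job, predict, submit, max_wait_hours=24):
    """Mirror the two-step flow: check the allocation, ask for a start-time
    prediction, and only then submit. All field names are illustrative."""
    if job["allocation_remaining"] < job["requested_hours"]:
        return "rejected: insufficient allocation"
    if predict(job) > max_wait_hours:          # PredictJobStartTime stand-in
        return "rejected: queue wait too long"
    return submit(job)                         # SubmitJob stand-in

job = {"name": "g03-opt", "requested_hours": 8, "allocation_remaining": 100}
result = submit_if_reasonable(job,
                              predict=lambda j: 3,  # predicted wait, hours
                              submit=lambda j: "submitted")
print(result)  # submitted
```

Doing the prediction before the submission is what lets the middleware steer jobs away from oversubscribed queues instead of discovering the wait after the fact.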

33 National Center for Supercomputing Applications GMS_WS Monitoring (GC Client ↔ GMS ↔ Resources/Kits/DB): client requests job status, resource status, and allocation balance; GMS sends info as XML, which the client parses and displays. cron@GMS server and cron@HPC servers discover applications (software resources), monitor systems, and monitor queues. Job launcher notifications and VO admin email are parsed into the DB (status + cost). UserResource RP updated from DB. (PT = portType, RP = Resource Properties, DTO = Data Transfer Object, DB = Database)

34 National Center for Supercomputing Applications GMS_WS Job Status (GC Client ↔ GMS ↔ Resources/Kits/DB): status flows from the job launcher and scheduler emails/notifications into jobDTO.status; status updates include an estimated start time. Notifications go to the client, email, or IM.

35 National Center for Supercomputing Applications GMS_WS File Retrieval (MSS) (GC Client ↔ GMS ↔ Resources/Kits/DB): on job completion, output is sent to the MSS. LoadFile PT (project folder + job): GMS validates that the project folder is owned by the user, queries the MSS (UserFiles RP + FileDTO object), retrieves the root directory listing on the MSS with the CoGKit, GAT, or gsi-ssh API, and sends the new listing. RetrieveFiles PT (+ file relative path): retrieves the file via CoGKit, GAT, or gsi-ssh; creates a FileDTO, loads it into the UserData RP, and stores it locally. GetResourceProperty PT. (PT = portType, RP = Resource Properties, DTO = Data Transfer Object, MSS = Mass Storage System)

36 National Center for Supercomputing Applications GMS_WS File Retrieval RetrieveJobOutput PT (+ JobDTO): GMS looks up the job record in the DB; if the job is running, the file comes from the resource; if complete, from the MSS. The file is retrieved via CoGKit, GAT, or gsiftp; a FileDTO is created and loaded into the UserData RP. GetResourceProperty PT. (PT = portType, RP = Resource Properties, DTO = Data Transfer Object, MSS = Mass Storage System)
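The ownership check behind the file-retrieval slides ("validates project folder owned by user") can be sketched as a path guard. The MSS directory layout assumed here is invented for illustration:

```python
from pathlib import PurePosixPath

def resolve_in_project(mss_root, user, requested):
    """Only serve paths inside the requesting user's project folder on the
    mass storage system; reject traversal attempts."""
    base = PurePosixPath(mss_root) / user
    candidate = base / requested
    # Collapse ".." components so escape attempts become visible.
    parts = []
    for part in candidate.parts:
        if part == "..":
            if parts:
                parts.pop()
        else:
            parts.append(part)
    clean = PurePosixPath(*parts)
    if clean != base and not str(clean).startswith(str(base) + "/"):
        raise PermissionError("path outside user's project folder")
    return str(clean)

print(resolve_in_project("/mss", "alice", "jobs/run1/output.log"))
# /mss/alice/jobs/run1/output.log
```

Any service that hands remote files to authenticated users needs some check of this shape before touching the transfer layer (CoGKit, GAT, or gsiftp in GridChem's case).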

37 National Center for Supercomputing Applications GridChem Web Services WSRF (Web Services Resource Framework) Compliant WSRF Specifications: WS-ResourceProperties (WSRF-RP) WS-ResourceLifetime (WSRF-RL) WS-ServiceGroup (WSRF-SG) WS-BaseFaults (WSRF-BF) %ps -aux | grep ws /usr/java/jdk1.5.0_05/bin/java \ -Dlog4j.configuration=container-log4j.properties \ -DGLOBUS_LOCATION=/usr/local/globus \ -Djava.endorsed.dirs=/usr/local/globus/endorsed \ -DGLOBUS_HOSTNAME=derrick.tacc.utexas.edu \ -DGLOBUS_TCP_PORT_RANGE=62500,64500 \ -Djava.security.egd=/dev/urandom \ -classpath /usr/local/globus/lib/bootstrap.jar: /usr/local/globus/lib/cog-url.jar: /usr/local/globus/lib/axis-url.jar org.globus.bootstrap.Bootstrap org.globus.wsrf.container.ServiceContainer -nosec Logging Configuration Where to find Globus Where to get random seed for encryption key generation Classpath (required jars)

38 National Center for Supercomputing Applications GridChem Software Organization Open Source Distribution CVS for GridChem

39 National Center for Supercomputing Applications Package: org.gridchem.service.gms GMS_WS

40 National Center for Supercomputing Applications GMS_WS + Should these each be a separate package?

41 National Center for Supercomputing Applications GMS_WS package layout:
gms – classes for WSRF service implementation (PT)
client – command-line tests to mimic client requests
dao – Data Access Object: queries the DB via persistent classes (Hibernate)
dto – Data Transfer Objects (Job, File, Hardware, Software, User) as XML
exceptions – how to handle errors
model – CCG service business model (how to interact)
credential – contains user credentials for job submission, file browsing, …
file / file.task – oversees correct handling of user data (get/put file)
job / job.task – defines Job plus utilities and enumerations (SubmitTask, KillTask, …)
resource – CCGResource and utilities, synched by GPIR; abstract classes NetworkRes., ComputeRes., SoftwareRes., StorageRes., VisualizationRes.
user – User (has attributes Preference/Address)
persistence – DB operations (CRUD), OR maps, pool management, DB sessions
proxy – classes that communicate with other web services
synch / gpir – periodically update the DB with GPIR info (GPIR calls)
query – DB queries
test – JUnit service tests (gms.properties): authentication, VO retrieval, resource query, synch, job management, file management, notification
util – utility and singleton classes for the service
crypt – encryption of the login password
enumerators – mapping from GMS_WS enumeration classes to the DB
gat – GAT utility classes: GATContext and GAT preferences generation; classes that deal with CoGKit configuration
notification – autonomous notification via email, IM, text message
audit – auditing

42 National Center for Supercomputing Applications GMS_WS external jars Testing For XML parsing: JDOM (Java Document Object Model) –Lightweight –Reading/Writing XML Docs –Complements SAX (parser) & DOM –Uses Collections

43 National Center for Supercomputing Applications GridChem Resources Monitoring http://portal.gridchem.org:8080/gridsphere/gridsphere?cid=home

44 National Center for Supercomputing Applications GridChem Resources New Computing Systems

System                    Capacity (CPUs/Cores)   Capability
Mercury (NCSA)                     1774           Small/Large Parallel Runs
Abe (NCSA)                         9600           Massively Parallel Runs
DataStar (SDSC)                    2368           Shared-Memory Large Runs
BlueGene/L (SDSC)                  3456           Cluster Large Parallel Runs
TeraGrid Cluster (SDSC)             564           Small/Large Parallel Runs
BigRed (IU)                        1024           Shared-Memory Small/Large Runs
BCX (UKy)                          1360           Shared/Distributed-Memory Small/Large Parallel Runs

45 National Center for Supercomputing Applications Application Software Resources Currently Supported

Suite         Version      Location
Gaussian 03   C.02/D.01    Many Platforms
MolPro        2006.1       NCSA
NWChem        5.0/4.7      Many Platforms
Gamess        Jan 06       Many Platforms
Amber         8.0          Many Platforms
QMCPack       2.0          NCSA

46 National Center for Supercomputing Applications GridChem Software Resources New Applications Integration Underway ADF Amsterdam Density Functional Theory Wien2K Linearized Augmented Plane Wave (DFT) CPMD Car-Parrinello Molecular Dynamics QChem Molecular Energetics (Quantum Chemistry) Aces3 Parallel Coupled Cluster Quantum Chemistry Gromacs Nano/Bio Simulations (Molecular Dynamics) NAMD Molecular Dynamics DMol3 Periodic Molecular Systems (Quantum Chemistry) Castep Quantum Chemistry MCCCS-Towhee Molecular Conformation Sampling (Monte Carlo) Crystal98/06 Crystal Optimizations (Quantum Chemistry) ….

47 National Center for Supercomputing Applications GridChem User Services Allocation https://www.gridchem.org/allocations/index.shtml Community and External Registration Reviews, PI Registration and Access Creation Community User Norms Established Consulting/User Services https://www.gridchem.org/consult Ticket tracking, Allocation Management Documentation, Training and Outreach https://www.gridchem.org/doc_train/index.shtml FAQ Extraction, Tutorials, Dissemination Help is integrated into the GridChem client

48 National Center for Supercomputing Applications Users and Usage 242 users under 128 projects, including academic PIs, two graduate classes, and about 15 training users More than 442,000 CPU wall hours since Jan 06 More than 10,000 jobs processed

49 National Center for Supercomputing Applications Science Enabled Azide Reactions for Controlling Clean Silicon Surface Chemistry: Benzylazide on Si(100)-2x1. Semyon Bocharov et al., J. Am. Chem. Soc., 128 (29), 9300-9301, 2006. Chemistry of Diffusion Barrier Film Formation: Adsorption and Dissociation of Tetrakis(dimethylamino)titanium on Si(100)-2x1. Rodriguez-Reyes, J. C. F.; Teplyakov, A. V. J. Phys. Chem. C 2007, 111(12), 4800-4808. Computational Studies of [2+2] and [4+2] Pericyclic Reactions between Phosphinoboranes and Alkenes. Steric and Electronic Effects in Identifying a Reactive Phosphinoborane that Should Avoid Dimerization. Thomas M. Gilbert and Steven M. Bachrach, Organometallics, 26 (10), 2672-2678, 2007.

50 National Center for Supercomputing Applications Science Enabled Chemical Reactivity of the Biradicaloid (HO...ONO) Singlet States of Peroxynitrous Acid. The Oxidation of Hydrocarbons, Sulfides, and Selenides. Bach, R. D. et al. J. Am. Chem. Soc. 2005, 127, 3140-3155. The "Somersault" Mechanism for the P-450 Hydroxylation of Hydrocarbons. The Intervention of Transient Inverted Metastable Hydroperoxides. Bach, R. D.; Dmitrenko, O. J. Am. Chem. Soc. 2006, 128(5), 1474-1488. The Effect of Carbonyl Substitution on the Strain Energy of Small Ring Compounds and their Six-member Ring Reference Compounds. Bach, R. D.; Dmitrenko, O. J. Am. Chem. Soc. 2006, 128(14), 4598.

51 National Center for Supercomputing Applications GridChem Client Download Statistics http://download.gridchem.org/usage/

52 National Center for Supercomputing Applications Distribution of GridChem User Community

53 National Center for Supercomputing Applications Job Distribution

54 National Center for Supercomputing Applications System Wide Usage

HPC System          Usage (SUs)
Tungsten (NCSA)           5,507
Copper (NCSA)            86,484
CCGcluster (NCSA)        55,709
Condor (NCSA)                30
SDX (UKy)               116,143
CCGCluster (UKy)            0.5
Longhorn (TACC)              54
CCGCluster (OSC)         62,000
TGCluster (OSC)          36,936
Cobalt (NCSA)             2,485
Champion (TACC)              11
Mike4 (LSU)              14,537

55 National Center for Supercomputing Applications GridChem Client Enhancements New Molecular Editor JMolEditor (ANU) integration VMD is integrated Nanotube Generator (Tubegen) will be available Gamess Graphical User Interface

56 National Center for Supercomputing Applications Java Molecular Editor JMolEditor Three Dimensional Visual with Java 3D Intuitive Molecule Manipulation Interactive Bond, Angle and Dihedral Settings A Gaussian input generator Interface

57 National Center for Supercomputing Applications Nanotube Generator:Tubegen Courtesy : Doren Research Group at the University of Delaware Crystal Cell Types Output Formats

58 National Center for Supercomputing Applications GridChem Gamess GUI

59 National Center for Supercomputing Applications GridChem Post Processing IR/Raman Spectra now accessible from G03, MolPro, NWChem and Gamess Suites VCD/ROA To be Included

60 National Center for Supercomputing Applications GridChem Post Processing Normal Mode Viewing in 3D VRML Other Spectra With MO Integration NMR Electronic Spectra

61 National Center for Supercomputing Applications GridChem Usability Dynamic Information

62 National Center for Supercomputing Applications GridChem Usability Information on Potential Start and End Time for a given set of Job parameters Automated Resource Selection Possible Job Migration In case of dropped nodes or incomplete job Monitoring Multiple Jobs Automated Monitoring Job Output

63 National Center for Supercomputing Applications GridChem Middleware Infrastructure Implementation Currently underway: Implementation of GRMS resource management service http://www.gridlab.org/WorkPackages/wp-9 Moving toward service-based job submission, eliminating gateway interfaces Infrastructure for multiple input files for a single application Infrastructure for multiple inputs in high-throughput processing Integrated workflow for multi-scale coupled modeling Meta-scheduling for high-throughput processing: match making, round-robin scheduling, preferred host set usage
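The match-making plus round-robin scheduling mentioned for high-throughput processing can be sketched in a few lines. The host list, application names, and job records below are invented for illustration:

```python
from itertools import cycle

# Hypothetical preferred host set with the applications each host offers.
hosts = [
    {"name": "mercury", "apps": {"Gaussian", "NWChem"}},
    {"name": "abe",     "apps": {"GAMESS"}},
    {"name": "bigred",  "apps": {"Gaussian", "GAMESS"}},
]

def schedule(jobs, hosts):
    """Deal jobs round-robin across the host set, skipping hosts that do
    not offer the application a job needs (the match-making step)."""
    ring = cycle(hosts)
    placement = {}
    for job in jobs:
        for _ in range(len(hosts)):          # at most one full lap per job
            host = next(ring)
            if job["app"] in host["apps"]:
                placement[job["name"]] = host["name"]
                break
        else:
            placement[job["name"]] = None    # no host offers this application
    return placement

jobs = [{"name": "j1", "app": "Gaussian"},
        {"name": "j2", "app": "GAMESS"},
        {"name": "j3", "app": "Gaussian"}]
print(schedule(jobs, hosts))
# {'j1': 'mercury', 'j2': 'abe', 'j3': 'bigred'}
```

Round-robin keeps the load spread across the preferred host set, while the match-making filter guarantees a job never lands on a host that cannot run it.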

64 National Center for Supercomputing Applications GridChem In New Collaborations Resource Providers New Resource Providers Open Science Grid Initially for Bio-related applications (open source preferably) PRAGMA Partner sites University of Hyderabad ORNL (Could be via TeraGrid) International Partners KISTI, APAC, Daresbury Labs

65 National Center for Supercomputing Applications Scientific Collaborations GridChem Extension to Molecular Sciences (Bio, Nano, Geo and Materials Sciences) (NSF Proposal) Parameter Sweep for Potential Energy Hyper Surfaces (Faculty Fellows, NCSA) Automated Parameterization of Force fields (NSF Proposal) Ab initio Molecular Dynamics (Faculty Fellows, NCSA) Education (CI-TEAM) (NSF Proposals) Multi-Scale Modeling (IACAT, UIUC )

66 National Center for Supercomputing Applications Some New GridChem Infrastructure Workflow Editors Coupled Application Execution Large Scale Computing Metadata and Archiving Rich Client Platform Refactorization Intergrid Interactions Open Source Distribution http://cvs.gridchem.org/cvs/ Open Architecture and Implementation details http://www.gridchem.org/wiki

67 National Center for Supercomputing Applications Critical Gateway Issues Science gateways compete with business as usual for the end-user research scientist No direct access to HPC systems may be possible, leading to an apparent lack of control for users No end-to-end solutions: if part of the research needs require the old ways, gateways may be avoided Learning to use gateways should provide substantial added benefit –Cost/benefit issues for users Flexibility to integrate new applications quickly, as needed by the community, is critical to keep the user community engaged

68 National Center for Supercomputing Applications Authentication

69 National Center for Supercomputing Applications Resource Status

70 National Center for Supercomputing Applications Job Editor

71 National Center for Supercomputing Applications Job Submission

72 National Center for Supercomputing Applications Job Monitoring

73 National Center for Supercomputing Applications Gradient Monitoring

74 National Center for Supercomputing Applications Energy Monitoring

75 National Center for Supercomputing Applications Post Processing

76 National Center for Supercomputing Applications Visualization Molecular Visualization Electronic Properties Spectra Vibrational Modes

77 National Center for Supercomputing Applications Molecular Visualization Better molecule representations (Ball and Stick/VDW/MS) in the Nanocad Molecular Editor Third-party visualizer integration: Chime/VMD Export possibilities to other interfaces Deliver standard file formats (XML, SDF, MSF, SMILES, etc…)

78 National Center for Supercomputing Applications Eigen Function Visualization Molecular Orbital/Fragment Orbital MO Density Visualization MO Density Properties Other functions Radial distribution functions

79 National Center for Supercomputing Applications Some example Visuals Arginine Gamess/6-31G* Total electronic density 2D - Slices

80 National Center for Supercomputing Applications Electron Density in 3D Interactive (VRML)

81 National Center for Supercomputing Applications Orbital 2D Displays N2 6-31g* Gamess

82 National Center for Supercomputing Applications Orbital 3D VRML

83 National Center for Supercomputing Applications Spectra IR/Raman Vibrotational Spectra UV Visible Spectra Spectra to Normal Modes Spectra to Orbitals

84 National Center for Supercomputing Applications Possible H-bond network for the P450cam hydroperoxy intermediate Suggested: THR252 accepts an H-bond from the hydroperoxy Fe(III)-OOH that promotes the second protonation on the distal oxygen, leading to O-O bond cleavage Nagano, S.; Poulos, T. L. J. Biol. Chem. 2005, 250, 1668. Auclair, K.; Hu, Z.; Little, D. M.; Ortiz de Montellano, P. R.; Groves, J. T. J. Am. Chem. Soc. 2002, 124, 6020.

85 National Center for Supercomputing Applications The Somersault Isomerization of Model Cpd0 Robert Bach and Olga Dmytrenko, 2006

86 National Center for Supercomputing Applications Energy Diagram for the Concerted Non-synchronous Hydroxylation of Isobutane Energy diagram (kcal/mol) for the oxidation of the isobutane with ground state, 24a (GS-8 hydrogen bonded to isobutane). MIN-24b [model oxidant MIN-10 (PorFe(SH)O HO) hydrogen bonded to isobutene] is not necessarily on the reaction pathway.

87 National Center for Supercomputing Applications Somersault Mechanism Summary for Isobutane Hydroxylation

88 National Center for Supercomputing Applications TetrakisDimethylAminoTitanium and its derivatives on Si(100)- 2x1 Surface: Diffusion Barrier Thinfilms on Silicon Rodrigues-Reyes and Teplyakov

89 National Center for Supercomputing Applications Benzylazide on Si(100)-2x1 Surface Deposition of Aromatic Moieties on Silicon for Lateral Electron Transfer Bocharov et al..

90 National Center for Supercomputing Applications [2+2] Cyclo Additions involving B=P Bonds Gilbert and Bachrach Dimerization Ethyne Addition Ethene Additions

91 National Center for Supercomputing Applications Possible H-bond network for the P450cam hydroperoxy intermediate Suggested: THR252 accepts an H-bond from the hydroperoxy Fe(III)-OOH that promotes the second protonation on the distal oxygen, leading to O-O bond cleavage Nagano, S.; Poulos, T. L. J. Biol. Chem. 2005, 250, 1668. Auclair, K.; Hu, Z.; Little, D. M.; Ortiz de Montellano, P. R.; Groves, J. T. J. Am. Chem. Soc. 2002, 124, 6020.

92 National Center for Supercomputing Applications The Somersault Isomerization of Model Cpd0 Robert Bach and Olga Dmytrenko, 2006

93 National Center for Supercomputing Applications Energy Diagram for the Concerted Non-synchronous Hydroxylation of Isobutane Energy diagram (kcal/mol) for the oxidation of the isobutane with ground state, 24a (GS-8 hydrogen bonded to isobutane). MIN-24b [model oxidant MIN-10 (PorFe(SH)O HO) hydrogen bonded to isobutene] is not necessarily on the reaction pathway.

94 National Center for Supercomputing Applications Somersault Mechanism Summary for Isobutane Hydroxylation

95 National Center for Supercomputing Applications Unsymmetrical Mo(CO) 4 Crown Ethers

96 National Center for Supercomputing Applications Dibenzaphosphepin based bis(phosphorous)polyether chelated Mo(CO) 4

97 National Center for Supercomputing Applications Crystal Structures CSD:XAPZAP cis-(6,6'-((1,1'-Binaphthyl)-2,2'-diylbis(oxy))bis(dibenzo(d,f)(1,3,2)dioxaphosphepin))-tetracarbonyl-molybdenum(0) C48 H28 Mo1 O10 P2 CSD:DEQDOS cis-Tetracarbonyl-(P,P'-(6-(2'-oxy-2-biphenyl)-3,6-dioxa-hexanolato)-bis(dibenzo(d,f)(1,3,2)dioxaphosphepine)-P,P')-molybdenum C44 H32 Mo1 O12 P2

98 National Center for Supercomputing Applications Starting Structure

99 National Center for Supercomputing Applications Optimized Structure

100 National Center for Supercomputing Applications Reference Structure for Comparison 8 7

101 National Center for Supercomputing Applications Structural Comparisons C-C Torsion Angles for the OCH2CH2O Fragments and for the Axially Chiral Biaryl Groups

Atoms              PCMODEL*     UFF    Ab Initio    Amber
C37-C42-C43-C48      -49.9     -26.4     -43.0      -40.4
C1-C6-C7-C12          45.4      22.3     -22.3      -72.8
C13-C22-C23-C32       75.6      74.7     -85.9      -81.2
C32-O-C33-C34       -178.4    -140.8     159.7     -171.2
O-C33-C34-O           62.4     -64.5     -87.3      -82.4
C33-C34-O-C35        -80.6    -118.9      67.8       64.9
C34-O-C35-C36        174.6     118.9    -153.4       60.1
O-C35-C36-O           66.2      56.0      64.0       67.3

*Hariharasarma, et al. Organomet., 1232-1238, 2000. Ab Initio = B3LYP/3-21G*. Amber = Amber9 ff03, GAFF, chloroform, 300 K, median over 1 ns MD.

102 National Center for Supercomputing Applications MD OCH 2 CH 2 O Structure 8 7

103 National Center for Supercomputing Applications MD Biaryl Structure

104 National Center for Supercomputing Applications 1H NMR Chemical Shift Comparison for Aromatic Protons. Reference: 32 ppm (from TMS, B3LYP/6-31G*).

Atom   Exp.    Ab initio     Atom   Exp.    Ab initio
H2     7.025   5.6           H25    6.578   5.7
H3     7.026   5.8           H26    6.737   5.9
H4     7.049   5.9           H27    7.018   6.1
H5     7.181   6.0           H28    7.623   6.5
H8     7.110   6.1           H30    7.790   6.7
H9     6.890   6.0           H31    7.289   6.9
H10    6.721   6.0
H11    6.237   5.7           H38    7.327   6.2
                             H39    7.274   6.1
H14    7.925   5.8           H40    7.169   6.0
H15    7.808   6.3           H41    7.350   6.3
H17    7.741   6.0           H44    7.360   6.1
H18    7.254   5.6           H45    7.160   5.9
H19    7.091   5.1           H46    7.176   6.0
H20    6.989   4.6           H47    7.060   7.0

105 National Center for Supercomputing Applications Third Year Plans Post Processing Spectra and related entities New Application Support Aces3, Dmol3, Vasp,….. Expansion of Resources Teragrid, OSG, Pragma Systems and New resources at Partner Sites Extension Plan Two Proposals in review for Extension

106 National Center for Supercomputing Applications Future Plans Preparations for Petaflop computing High throughput massively parallel applications Complex workflows for integrating multiple interdependent applications Multiscale Computing Archiving and annotating data for future use Open Data initiatives by NIH and NSF

107 National Center for Supercomputing Applications Acknowledgments Rion Dooley, TACC Middleware Infrastructure Stelios Kyriacou, OSC Middleware Scripts Chona Guiang, TACC Databases and Applications Kent Milfeld, TACC Database Integration Kailash Kotwani, NCSA, Applications and Middleware Scott Brozell, OSC, Applications and Testing Michael Sheetz, UKy, Application Interfaces Vikram Gazula, UKy, Server Administration Tom Roney, NCSA, Server and Database Maintenance

