1
Tools and methods for multiscale biomolecular simulations
Celeste Sagui, Department of Physics, NC State University, Raleigh, NC
We are a partnership of 7 researchers located at three universities and the National Institute of Environmental Health Sciences (NIEHS), all within North Carolina's Research Triangle:
NC State members: Jerry Bernholc, Lubos Mitas, Christopher Roland and Celeste Sagui (PI) – Physics
UNC member: Lee Pedersen – Chemistry and NIEHS
Duke member: John Board – Computer Science
NIEHS collaborator: Tom Darden – Structural Biology Lab
2
ITR Scientific Aims
- develop a set of scalable computational tools for large-scale, multiscale biomolecular simulations
- multiscale methods will range from Quantum Monte Carlo (QMC) to continuum methods
- codes will be based on real-space grids, with multigrid acceleration of convergence
- electrostatics will be treated in a highly efficient and accurate manner
- codes will be used to solve paradigmatic biomolecular problems
- codes will ultimately be distributed under the Open Source GPL license
3
Some Highlights of Current Progress
1. Accurate and efficient electrostatics for large-scale biomolecular simulations (Sagui, Darden, Roland)
2. Coupling of QMC and MD (Mitas)
3. Thomas-Fermi and DFT (Bernholc)
4. PMEMD and AMBER 8.0 (Pedersen, Duke)
5. Applications
5
Classical Molecular Dynamics Developments: Accurate Electrostatics
6
Accurate and Efficient Electrostatics for Large-Scale Biomolecular Simulations
Accurate electrostatics is absolutely essential for meaningful biomolecular simulations: electrostatic interactions stabilize the delicate 3-d structures, bind complexes together, and represent the computational bottleneck in current simulations.
Key challenges:
(a) A more accurate description of electrostatics with higher-order multipoles is needed (our solution: Wannier functions)
(b) Computationally efficient ways of simulating such systems are needed (our solution: advanced PME/multigrid approaches)
7
Limitations of Current Modeling of Electrostatic Fields
Higher-order multipoles have not been implemented due to their overwhelming cost:
1. Multipoles up to hexadecapoles have 35 degrees of freedom, so the interaction matrix between them has 1225 components; a fixed-cutoff implementation therefore costs 3 orders of magnitude more than charges alone!
2. An Ewald implementation grows like O(N²)
3. "Use of cutoffs alleviates the problem": WRONG! Much of the cost originates in the direct part, and truncation leads to artifactual behavior unless cutoffs of the order of 25 Å are used (see the arithmetic check below)
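The component counts above are simple combinatorics: a rank-l symmetric Cartesian multipole has (l+1)(l+2)/2 independent components. A minimal Python check (illustrative arithmetic only, not project code):

components = [(l + 1) * (l + 2) // 2 for l in range(5)]  # l = 0 (charge) to 4 (hexadecapole)
print(components, sum(components))   # [1, 3, 6, 10, 15] 35 degrees of freedom
print(sum(components) ** 2)          # 1225 components in the multipole-multipole interaction matrix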
8
Our Solution
1. Implement a McMurchie-Davidson formalism for the direct part of the Ewald summation
2. Switch most of the calculation to reciprocal space
3. Implement a Particle-Mesh Ewald (PME)-based approach for single-processor machines
4. Implement a multigrid-based approach for parallel machines
Ref: Sagui, Pedersen, and Darden, J. Chem. Phys. 120, 73 (2004)
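For orientation, the Ewald split behind steps 1 and 2 divides 1/r into a short-ranged erfc(βr)/r piece summed directly within a cutoff, plus a smooth erf(βr)/r remainder handled in reciprocal space. A minimal point-charge sketch of the direct part in Python (a schematic assuming an orthorhombic box; the actual code uses the McMurchie-Davidson scheme to extend this to higher multipoles):

import numpy as np
from math import erfc

def ewald_direct(positions, charges, beta, r_cut, box):
    """Direct-space Ewald energy for point charges (minimum image)."""
    energy = 0.0
    n = len(charges)
    for i in range(n):
        for j in range(i + 1, n):
            d = positions[i] - positions[j]
            d -= box * np.round(d / box)        # minimum-image convention
            r = np.linalg.norm(d)
            if r < r_cut:
                # erfc(beta*r)/r decays rapidly, so the cutoff is benign;
                # the smooth remainder erf(beta*r)/r is evaluated in k-space
                energy += charges[i] * charges[j] * erfc(beta * r) / r
    return energy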
9
PME-based results for 4096 water molecules
Single-processor Intel Xeon, 3.06 GHz, 512 kB cache, 2 GB memory, g77 compiler

                 β (Å⁻¹)  R_c (Å)  Spline order  h_x (Å)  Direct (s)  Reciprocal (s)  Overall (s)
charges           0.50     5.63        5          0.775      0.21         0.18           0.42
dipoles           0.50     5.60        5          0.775      0.33         0.21           0.58
quadrupoles       0.55     5.10        6          0.668      0.57         0.32           0.96
octupoles         0.70     4.25        8          0.620      0.93         0.60           1.70
hexadecapoles     0.85     3.60        8          0.459      1.54         1.12           3.05

Relative RMS force error: 5×10⁻⁴; for error 5×10⁻⁵ the hexadecapole cost is 4.4 s, and for 5×10⁻⁶ it is 5 s.
With a plain R_c = 8 Å cutoff, the cost is 6 times more than with PME and the RMS error is about 0.05.
10
Multigrid-based results for 4096 water molecules
Single-processor Intel Xeon, 3.06 GHz, 512 kB cache, 2 GB memory, g77 compiler

                 β (Å⁻¹)  R_c (Å)  R_G (Å)  h_x (Å)  Direct (s)  Reciprocal (s)
charges           0.60     5.20     3.50     0.620      0.20          2.30
dipoles           0.61     5.20     3.63     0.620      0.29          2.66
quadrupoles       0.70     4.80     3.45     0.516      0.52          4.64
octupoles         0.75     4.25     3.10     0.443      0.94          8.42
hexadecapoles     0.79     4.25     3.05     0.388      2.21         15.71

Relative RMS force error: 5×10⁻⁴
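The reciprocal-space timings above come from solving for the smooth part of the potential on a real-space grid. The engine of such a solver is the multigrid V-cycle: relax, restrict the residual to a coarser grid, solve there recursively, interpolate the correction back, and relax again. A toy 1D Poisson V-cycle in Python, just to show the control flow (a sketch only; the production solver is 3D and handles multipolar sources):

import numpy as np

def smooth(u, f, h, sweeps=3):
    # Gauss-Seidel relaxation on -u'' = f with u[0] = u[-1] = 0
    for _ in range(sweeps):
        for i in range(1, len(u) - 1):
            u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
    return u

def v_cycle(u, f, h):
    u = smooth(u, f, h)
    if len(u) > 3:                                   # recurse until the coarsest grid
        r = np.zeros_like(u)                         # residual r = f - A u
        r[1:-1] = f[1:-1] + (u[:-2] - 2 * u[1:-1] + u[2:]) / (h * h)
        e_c = v_cycle(np.zeros_like(r[::2]), r[::2].copy(), 2 * h)  # coarse-grid correction
        e = np.zeros_like(u)
        e[::2] = e_c                                 # prolongation: copy coarse points...
        e[1::2] = 0.5 * (e_c[:-1] + e_c[1:])         # ...and interpolate the fine ones
        u += e
    return smooth(u, f, h)

# usage: n = 65; h = 1.0 / (n - 1); u = np.zeros(n); f = np.ones(n)
# for _ in range(10): u = v_cycle(u, f, h)   # converges to the Poisson solution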
11
MD simulation of the DNA decamer d(CCAACGTTGG)₂ in a crystal environment
J. Baucom et al., submitted to J. Chem. Phys. (2004)
12
[Figure: RMS deviation of constant-pressure crystal simulations with respect to the crystal structure; curves compare charges only vs. charges and induced dipoles.]
13
Calculating Multipole Moments via Wannier Functions (WFs)
To partition the charge density and calculate the multipole moments, we use a Wannier function (WF) approach. This has several advantages:
1. WFs provide a chemically and physically intuitive way of partitioning the charge (ref. Marzari and Vanderbilt, PRB 56, 12847 (1997))
2. WFs are distributed in space, which allows for a more faithful representation of the electrostatic potential
3. no ad hoc assignment of the charge
4. the procedure is numerically quite stable
Ref: C. Sagui, P. Pomorski, T. Darden and C. Roland, J. Chem. Phys. 120, 4530 (2004)
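Once the density is partitioned among WF centers, each fragment's multipole moments are weighted integrals about its center. A schematic numpy version for a density sampled on grid points (a hypothetical data layout, shown only to make the partitioning idea concrete; the real code works with the ab initio real-space representation):

import numpy as np

def fragment_multipoles(rho, points, center, dv):
    """Monopole, dipole, and Cartesian second moment of one density
    fragment: rho holds density values at the (N, 3) grid coordinates
    in points, and dv is the grid volume element."""
    d = points - center
    q = rho.sum() * dv                                   # monopole (charge)
    mu = (rho[:, None] * d).sum(axis=0) * dv             # dipole
    quad = np.einsum('n,ni,nj->ij', rho, d, d) * dv      # second moment (quadrupole before detracing)
    return q, mu, quad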
14
Maximally localized Wannier functions for water
The water molecule has 4 WFs:
- 2 associated with the OH bonds (light blue)
- 2 associated with the O lone pairs (dark blue)
15
[Figure: electrostatic potential for a single water molecule as generated by WFs; panels compare the ab initio potential with successive multipole expansions m, mdq, mdqo, mdqoh.]
16
Wannier functions for carbon dioxide
CO₂ has 8 WFs:
- 6 associated with the CO bonds (light blue)
- 2 associated with O (dark blue)
17
[Figure: electrostatic potential for the carbon dioxide molecule as generated by WFs; panels compare the ab initio potential with successive multipole expansions m, mdq, mdqo, mdqoh.]
18
Quantum Monte Carlo Developments
19
New continuous quantum Monte Carlo/molecular dynamics method
- we propose a new method for coupling ab initio molecular dynamics ionic steps with stochastic DMC electronic steps to provide accurate DMC energies "on the fly"
- it exploits the slowness of the MD evolution, which makes it possible to update the QMC sampling process very efficiently
- accurate both for thermal averages and for the description of energies along pathways
- we have carried out the first QMC/MD simulations using both forces and energies from QMC
Ref: J. Grossman and L. Mitas, preprint 2004
20
Coupling of QMC and MD: Basic Idea
Instead of discrete sampling of each point with a new QMC run: calculate QMC energies "on the fly" during the dynamical simulation!
- Continuously update the DMC walkers so that they correctly represent the evolving wave function (CDMC method)
- The evolution of both configuration spaces is coupled: as the ionic dynamical trajectories evolve, so does the population of DMC electrons
- average distance moved by an ion in one MD time step: 10⁻⁴ to 10⁻³ a.u.
- average distance moved by an electron in a typical DMC time step: 10⁻² to 10⁻¹ a.u.
21
Successful CDMC Algorithm (flowchart of one stable CDMC iteration; a code sketch follows below):
1. Ab initio MD step: new ionic positions R and wave function Ψ(R)
2. Compute orbital overlaps with current DMC walkers
3. Orbital swapping or rotation? If yes, do VMC then DMC; if no, check for node crossings and compute weights
4. Take DMC step(s) and calculate the energy
Result: a stable DMC population. How accurate is it? Benchmark against discrete DMC.
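In code form, the loop might be organized as below (a schematic sketch only; the MD and QMC operations are passed in as callbacks because the real implementations live in the ab initio and DMC engines, none of which is shown here):

def cdmc_run(state, walkers, n_md_steps, md_step, dmc_step,
             reweight_walkers, needs_reequilibration, reequilibrate,
             n_dmc_per_md=3):
    """Continuous DMC coupled to MD: the walker population is carried
    along the ionic trajectory instead of being regenerated from
    scratch at every MD step."""
    energies = []
    for _ in range(n_md_steps):
        state = md_step(state)                          # ab initio MD step: new R, psi(R)
        if needs_reequilibration(state, walkers):       # orbital swapping or rotation detected
            walkers = reequilibrate(state)              # rare: do VMC then DMC from scratch
        else:
            walkers = reweight_walkers(state, walkers)  # check node crossings, compute weights
        for _ in range(n_dmc_per_md):                   # a few DMC steps per MD step (3 suffice per the benchmarks)
            walkers, energy = dmc_step(state, walkers)
        energies.append(energy)
    return energies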
22
CDMC: Number of DMC steps needed per MD step
- As the simulation progresses, 1-step CDMC energies begin to differ significantly from discrete DMC
- Using 3 steps corrects the time "lag", and is 33 times more efficient than discrete sampling
- For comparison, large discretely sampled runs (1000 steps each) were used
Thermal averages (over 1 ps):
E(discrete DMC) = −6.228(2)
E(1-step continuous) = −6.220(2)
E(2-step continuous) = −6.220(2)
E(3-step continuous) = −6.226(2)
E(10-step continuous) = −6.230(2)
E(20-step continuous) = −6.228(2)
- Thermal averages are converged for N ≥ 3
- The same convergence (3 CDMC steps) is observed for Si₂H₆ and Si₅H₁₂
23
CDMC: Si₂H₆
As for SiH₄, asymmetrically stretch the molecule and let it go
Average temperature ~1500 K
24
CDMC: Si₂H₆ Results
- For Si₂H₆, 3 steps appear to lead to stability, as for SiH₄
- The number of steps looks like a function of the dynamics rather than of system size
- Can pinpoint specific types of strain that lead to wave-function lag
25
Test of the quantum Monte Carlo/molecular dynamics method on water dissociation
- DMC forces in very good agreement with DFT forces (SiH₄ at 1500 K)
- DMC-MD and DFT-MD trajectories are in excellent agreement (H₂O dissociation)
- "QMC-only" molecular dynamics, with no external input from DFT
26
Density Functional Theory Developments
27
Development of hybrid QM calculations: interfacing DFT with Thomas-Fermi calculations
Idea: in many biological systems, only part of the system is chemically active.
- Use ab initio methods for this part (real-space, multigrid-based code)
- Use more approximate methods for the rest of the system (here, a Thomas-Fermi approach with frozen densities for the molecules and gradient corrections)
Ref: M. Hodak, W. Lu, and J. Bernholc, in preparation
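For reference, the Thomas-Fermi side replaces the orbital-based kinetic energy with an explicit functional of the density; with a gradient (von Weizsäcker-type) correction, the kinetic term has the standard textbook form (shown only to indicate the level of approximation; the precise coefficients used in the actual code are not specified here):

T[n] \approx \frac{3}{10}\,(3\pi^{2})^{2/3} \int n(\mathbf{r})^{5/3}\, d\mathbf{r} \;+\; \lambda \int \frac{|\nabla n(\mathbf{r})|^{2}}{8\, n(\mathbf{r})}\, d\mathbf{r}

in atomic units, where λ scales the gradient correction (λ = 1 gives the full von Weizsäcker term).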
28
Hybrid calculation tests
● Interaction of two water molecules in a hybrid calculation
- Hydrogen bonding test
- Molecule 1: ab initio
- Molecule 2: Thomas-Fermi
Gives an estimated speed-up of 500 times!
29
Parallelization and Coding Advances
30
Improvements in performance and parallel scalability of AMBER MD
- A redesign of the AMBER SANDER program, along with a rewrite in FORTRAN 90, was undertaken with the goal of substantially improving the practicality of multi-nanosecond PME simulations (i.e., in the 100,000 to 300,000 atom range)
- The resulting software has been released to the AMBER community in 3 phases; the new software is named PMEMD, for Particle Mesh Ewald Molecular Dynamics
Release dates:
July 2003 – PMEMD 3.00
October 2003 – PMEMD 3.10
March 2004 – PMEMD 8.0 (part of AMBER 8.0)
PMEMD is now the primary high-performance MD modeling tool in AMBER!
31
Representative Benchmark
Results for the Factor IX constant-pressure system from Dr. Lalith Perera, a solvated system with a total of 90,906 atoms. The time step was 1.5 fs, the direct force cutoff was 8.0 Å, and all simulations used PME. The runs were done on the IBM 1.3 GHz p690 Regatta at the Edinburgh Parallel Computing Centre. Throughput in psec/day.

#procs  PMEMD 8  PMEMD 3.1  PMEMD 3.0  Sander 8  Sander 7  Sander 6
   8       nd       346        353        nd        233       182
  16      672       607        594        nd        279       258
  32     1125      1035        929        nd        306       297
  64     1975      1770       1127       369        318       339
  96     2743      2304         nd        nd         nd        nd
 112     2945      2631         nd        nd         nd        nd
 128     2516*     2864         nd       339         nd        nd

* Performance falloff observed here. The maximum performance obtained at a higher processor count was 3600 psec/day, but required using only 4 of the 8 CPUs on each 8-CPU multi-chip module of the SP4.
32
Software References
Maximum throughput obtainable on the SP4 is up to an order of magnitude higher for PMEMD 8 than for any version of SANDER.
Software publication references:
R.E. Duke and L.G. Pedersen, PMEMD 3.0 (2003)
R.E. Duke and L.G. Pedersen, PMEMD 3.1 (2003)
D.A. Case et al., AMBER 8.0 (2004)
33
Some Applications
1. QM/MM studies of enzymatic sulfate transfer in the heparan sulfate precursor
2. QM/MM studies of enzymatic sulfate transfer in estrogen sulfation
3. PMEMD study of the mammalian P450 enzyme and the ternary blood coagulation tissue factor complex
4. Protein folding study on ionic domains of the coagulation protein prothrombin
5. Solvation and deprotonation of formic acid
6. Crystallographic studies of DNA
7. Binding of vancomycin and teicoplanin antibiotics to bacterial cell wall termini
8. Structure and function of serine proteases
9. QM/MM study of the role of Mg ions in the mechanism of DNA polymerase
34
Mixed Quantum and Molecular Mechanics Simulations of the Sulfuryl Transfer Reaction Catalyzed by Human Estrogen Sulfotransferase
P. Lin and L. Pedersen
[Figure: active-site view with labels K47, H107, PAPS, and E2.]
35
Mixed Quantum and Molecular Mechanics Simulations of the Sulfuryl Transfer Reaction Catalyzed by Human Estrogen Sulfotransferase
P. Lin and L. Pedersen
- estrogen is one of the most important hormones found in the human body
- it is extremely important that the body regulate estrogen, being able to turn it both on and off
- the deactivation of estrogen takes place by transferring a sulfate group to the hormone
- the details of this important reaction were investigated by means of a mixed quantum and classical molecular dynamics simulation, as shown in the movie
- the movie shows how the sulfate group is placed on the estrogen
36
Summary
The scientific aims are to produce a set of scalable and portable computational tools for multiscale biomolecular calculations.
Considerable progress has been made in a number of areas:
1. Development of accurate and efficient methods for the treatment of long-range electrostatic forces
2. Development of coupled QMC and MD methods
3. Development of the DFT and TF interface
4. PMEMD and AMBER 8.0
5. Applications