Lecture 19: Free Energies in Modern Computational Statistical Thermodynamics: WHAM and Related Methods
Dr. Ronald M. Levy, ronlevy@temple.edu
Statistical Thermodynamics
Definitions

Canonical ensemble: we focus on free energy changes that do not involve changing the number of particles, their velocities, the volume, or the temperature. We can therefore work with the configurational partition function

  Z(λ) = ∫ e^{-βU(q, λ)} dq,   β = 1/kT

where λ denotes system parameters and/or constraints. Equivalently, in terms of the free energy:

  A(λ) = -kT ln Z(λ)
Example 1: Configurational Free Energy Changes

λ controls a conformational constraint (the distance between two particles, a dihedral angle, etc.). Call the constrained property f(q): only conformations that satisfy f(q) = λ (within a tolerance of dλ) are allowed:

  Z(λ) = ∫ δ(f(q) - λ) e^{-βU(q)} dq

Since ∫ δ(f(q) - λ) dλ = 1, integration of the constrained Z over all the possible constraints gives back the unconstrained Z:

  ∫ Z(λ) dλ = Z

Free energy work for imposing the constraint:

  e^{-βΔA(λ)} = Z(λ)/Z = ⟨δ(f(q) - λ)⟩
The ratio Z(λ)/Z is basically an average of a delta function over the unconstrained ensemble. The delta function is non-zero only exactly at f(q) = λ. In actual numerical applications we would consider some finite interval Δλ around a discrete set of values λ_i. Consider the free energy for imposing the constraint in the interval Δλ around λ_i:

  e^{-βΔA_i} = p_i = ∫ over Δλ around λ_i of ⟨δ(f(q) - λ)⟩ dλ
where p_i is the probability of finding the system within Δλ of λ_i; therefore

  A_i = -kT ln p_i + const

At the Δλ → 0 continuous limit:

  A(λ) = -kT ln p(λ) + const

where p(λ) is the probability density along λ.
The zero of the free energy is arbitrary. A(λ) is the potential of mean force (PMF) along λ:

  discrete:   A_i = -kT ln p_i + c
  continuous: A(λ) = -kT ln p(λ) + c
Lesson #1: configurational free energies can be computed by counting, that is, by collecting histograms or probability densities over unrestricted trajectories. This is probably the most used method, but not necessarily the most efficient, and the achievable range of free energies is limited. With N samples, the best case is N − 1 samples in bin 1 and one sample in bin 2:

  ΔA_max = kT ln(N − 1)

For N = 10,000, ΔA_max ≈ 5 kcal/mol at room temperature. In practice, many more samples than this minimum are needed to achieve sufficient precision.
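The counting procedure above can be sketched numerically. A minimal example, assuming an illustrative double-well potential U(x) = 2(x² − 1)² (not from the lecture) sampled with Metropolis Monte Carlo as a stand-in for a real unrestricted trajectory:

```python
import numpy as np

rng = np.random.default_rng(0)
kT = 0.593  # kcal/mol at ~298 K

def U(x):
    # illustrative double-well: minima at x = +/-1, barrier of 2 kcal/mol at x = 0
    return 2.0 * (x**2 - 1.0) ** 2

# Metropolis Monte Carlo as a stand-in for an unrestricted MD trajectory
x, samples = 1.0, []
for _ in range(200_000):
    x_trial = x + rng.normal(scale=0.5)
    dE = U(x_trial) - U(x)
    if dE <= 0 or rng.random() < np.exp(-dE / kT):
        x = x_trial
    samples.append(x)

# Lesson #1 in practice: histogram the trajectory, then A_i = -kT ln p_i (+ const)
counts, edges = np.histogram(samples, bins=50, range=(-2.0, 2.0))
p = counts / counts.sum()
A = np.full(p.shape, np.nan)
A[counts > 0] = -kT * np.log(p[counts > 0])
A -= np.nanmin(A)  # the zero of the free energy is arbitrary
```

The recovered PMF should reproduce the 2 kcal/mol barrier of the input potential, up to binning and sampling noise; empty bins (free energies beyond the reachable range) show up as NaN.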
Method #2: Biased Sampling

Add an "umbrella" potential w(x) to the potential energy, U'(q) = U(q) + w(x), to concentrate sampling in a region of interest. Any thermodynamic observable can then be "unbiased":

  ⟨O⟩ = ⟨O e^{+βw(x)}⟩_b / ⟨e^{+βw(x)}⟩_b

where ⟨…⟩_b denotes an average over the biased ensemble.
Example: the unbiased probability of being near x_i is

  p°_i ∝ p_i e^{+βw(x_i)}

where p_i is the biased histogram probability. This works only if the biasing potential depends only on x.
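A minimal numerical check of the unbiasing formula, assuming a hypothetical setup (not from the lecture): a harmonic umbrella w(x) = ½k(x − x₀)² on a flat underlying potential, so the biased samples are exactly Gaussian and reweighting by e^{+βw} must recover a flat distribution:

```python
import numpy as np

rng = np.random.default_rng(1)
kT = 1.0  # reduced units (illustrative)

# Harmonic umbrella on a FLAT potential: the biased ensemble is exactly Gaussian
k_umb, x0 = 10.0, 0.0
samples = rng.normal(loc=x0, scale=np.sqrt(kT / k_umb), size=100_000)

def w(x):
    return 0.5 * k_umb * (x - x0) ** 2

# Unbias: weight each sample's histogram contribution by e^{+beta w(x)}
boltz = np.exp(w(samples) / kT)
counts, edges = np.histogram(samples, bins=20, range=(-0.8, 0.8), weights=boltz)
p0 = counts / counts.sum()  # unbiased probabilities: ~uniform, 1/20 = 0.05 each
```

Note how the recovered distribution is only reliable where the biased simulation actually placed samples; far from x₀ the weights blow up and the statistics degrade, which is exactly the motivation for running multiple biased simulations.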
How do we "merge" data from multiple biased simulations? Each simulation s (e.g., s = 1, 2, 3), run with biasing potential w_s, yields histogram probabilities p_si and its own unbiased estimate p°_si of the same underlying distribution. The combined estimate is a weighted average:

  p°_i = Σ_s u_si p°_si,   with Σ_s u_si = 1

Any set of weights u_si gives the right answer on average, but what is the best set of u_si for a given finite sample size?
Multiple biased simulations, with biasing potentials w_s(x), are used to cover conformational space.
Weighted Histogram Analysis Method (WHAM): the optimal way of combining multiple sources of biased data.

If the unbiased probability p°_i were known, we could predict the biased distribution obtained with biasing potential w_s:

  p_si = f_s c_si p°_i

where p_si is the biased probability of bin i in simulation s, c_si = e^{-βw_s(x_i)} is the bias of bin i in simulation s, and f_s = [Σ_i c_si p°_i]^{-1} is a free energy factor that normalizes p_si. Any set of weights u_si gives the right answer, but what is the optimal (most accurate estimate) set of u_si for a given finite sample size?
The likelihood of observing histogram counts n_si in simulation s, given the bin probabilities p_si, is multinomial:

  P({n_si} | {p_si}) ∝ Π_i p_si^{n_si}

In terms of the unbiased probabilities:

  P({n_si} | {p°_i}) ∝ Π_i (f_s c_si p°_i)^{n_si}

The joint likelihood of the histograms from all simulations is the product over s:

  L = Π_s Π_i (f_s c_si p°_i)^{n_si}
Log likelihood:

  ln L = Σ_s Σ_i n_si ln(f_s c_si p°_i)

Maximum likelihood principle: choose the p°_i that maximize the likelihood of the observed histograms. To do so, we need to express f_s in terms of p°_i:

  f_s^{-1} = Σ_i c_si p°_i
Substituting, the log likelihood becomes (dropping terms independent of p°_i):

  ln L = Σ_s Σ_i n_si ln p°_i − Σ_s N_s ln Σ_i c_si p°_i + const

where N_s = Σ_i n_si is the total number of samples from simulation s. Setting ∂(ln L)/∂p°_i = 0:

  Σ_s n_si / p°_i − Σ_s N_s f_s c_si = 0
Thus (the WHAM equations):

  p°_i = Σ_s n_si / Σ_s N_s f_s c_si
  f_s^{-1} = Σ_i c_si p°_i

These are solved by iteration until convergence. Compare with the single-simulation case derived earlier: WHAM gives both the probabilities (PMFs) and the state free energies.

Ferrenberg & Swendsen (1989); Kumar, Kollman et al. (1992); Bartels & Karplus (1997); Gallicchio, Levy et al. (2005)
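The WHAM iteration can be sketched end to end. Everything in this example is a hypothetical setup, not from the lecture: nine harmonic umbrella windows along x over a double-well potential U(x) = (x² − 1)², sampled with Metropolis Monte Carlo, then combined with the self-consistent WHAM equations above:

```python
import numpy as np

rng = np.random.default_rng(2)
kT = 1.0  # reduced units; all parameters below are illustrative
edges = np.linspace(-2.5, 2.5, 51)
centers = 0.5 * (edges[:-1] + edges[1:])

def U(x):
    # double-well: minima at x = +/-1, barrier of 1 kT at x = 0
    return (x**2 - 1.0) ** 2

# Umbrella-sample 9 harmonic windows along x with Metropolis MC
windows = np.linspace(-2.0, 2.0, 9)
k_umb = 20.0
n_si, c_si = [], []
for x0 in windows:
    x, traj = x0, []
    for _ in range(20_000):
        x_t = x + rng.normal(scale=0.3)
        dE = (U(x_t) + 0.5 * k_umb * (x_t - x0) ** 2) \
           - (U(x)   + 0.5 * k_umb * (x   - x0) ** 2)
        if dE <= 0 or rng.random() < np.exp(-dE / kT):
            x = x_t
        traj.append(x)
    n_si.append(np.histogram(traj, bins=edges)[0])                 # counts n_si
    c_si.append(np.exp(-0.5 * k_umb * (centers - x0) ** 2 / kT))   # bias c_si
n_si, c_si = np.array(n_si), np.array(c_si)
N_s = n_si.sum(axis=1)

# WHAM self-consistent iteration:
#   p_i = sum_s n_si / sum_s N_s f_s c_si ,   f_s = 1 / sum_i c_si p_i
f = np.ones(len(windows))
for _ in range(5000):
    p = n_si.sum(axis=0) / (N_s[:, None] * f[:, None] * c_si).sum(axis=0)
    p /= p.sum()
    f_new = 1.0 / (c_si * p).sum(axis=1)
    if np.allclose(f_new, f, rtol=1e-12):
        break
    f = f_new

# PMF along x from the unbiased probabilities
A = np.full(p.shape, np.nan)
A[p > 0] = -kT * np.log(p[p > 0])
A -= np.nanmin(A)
```

The recovered PMF should show the two wells near x = ±1 and a barrier of about 1 kT at x = 0, and the converged f_s are the per-window free energy factors.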
What about those optimal combining weights we talked about? Substituting the single-simulation estimates p°_si = n_si / (N_s f_s c_si) into the WHAM solution, the WHAM optimal combining weights turn out to be:

  u_si = N_s f_s c_si / Σ_t N_t f_t c_ti

A simulation makes a large contribution at bin i if:
1. It provides many samples (large N_s).
2. Its bias is small compared to its free energy relative to the unbiased state.
WHAM: getting unbiased averages

Computing the average of an observable that depends only on x is straightforward:

  ⟨O⟩ = Σ_i O(x_i) p°_i

For a property that depends on some other coordinate y = y(q), solve for the joint distribution p°(x, y) (there is no bias on y, so the denominator depends only on x) and then integrate out x:

  p°(x_i, y_j) = Σ_s n_s(i, j) / Σ_s N_s f_s c_si
  p°(y_j) = Σ_i p°(x_i, y_j)

where the f_s come from WHAM.
Some WHAM Applications for PMFs

PMF of alanine dipeptide. Chekmarev, Ishida & Levy (2004) J. Phys. Chem. B 108: 19487-19495.
2D potential; bias = temperature. Gallicchio, Andrec, Felts & Levy (2005) J. Phys. Chem. B 109: 6722-6731.
β-hairpin peptide; bias = temperature. Gallicchio, Andrec, Felts & Levy (2005) J. Phys. Chem. B 109: 6722-6731.
The Concept of "WHAM weights"

Consider the simple average of O(x):

  ⟨O⟩ = Σ_i O(x_i) p°_i = Σ_i O(x_i) Σ_s n_si / Σ_s N_s f_s c_si

Now sum over individual samples k rather than bins, and set:

  ⟨O⟩ = Σ_k W_k O(x_k)

where

  W_k = 1 / Σ_s N_s f_s e^{-βw_s(x_k)}

is the WHAM weight of sample k. It measures the likelihood of encountering x_k in an unbiased simulation.
WHAM weights example: the probability distribution. Applying the previous averaging formula to p°_i itself:

  p°_i = Σ_{k ∈ bin i} W_k

That is, the unbiased probability at x_i is the sum of the WHAM weights of the samples belonging to that bin.
Latest development: binless WHAM = MBAR

In the WHAM equations, substitute the sum over bins with a sum over individual samples:

  f_s^{-1} = Σ_k e^{-βw_s(x_k)} / Σ_t N_t f_t e^{-βw_t(x_k)}

This is the MBAR equation, solved iteratively to convergence to get the f's. Distributions are then obtained by binning the corresponding WHAM weights W_k.

Shirts & Chodera, J. Chem. Phys. (2008); Tan, Gallicchio, Lapelosa & Levy, JCTC (2012).
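A binless sketch of the MBAR self-consistent equation, under a hypothetical two-state setup (not from the lecture): two harmonic biases placed symmetrically on a flat potential, so each biased ensemble is an exact Gaussian and, by symmetry, the two free energy factors must come out equal:

```python
import numpy as np

rng = np.random.default_rng(3)
kT = 1.0  # reduced units; the two-state setup below is hypothetical

# Two harmonic biases w_s(x) = 0.5*k*(x - x0_s)^2 on a FLAT potential
biases = [(4.0, -0.5), (4.0, +0.5)]  # (k, x0) per simulation
N_s = np.array([20_000, 20_000])
samples = np.concatenate([
    rng.normal(loc=x0, scale=np.sqrt(kT / k), size=n)
    for (k, x0), n in zip(biases, N_s)
])

# c[s, k] = exp(-beta * w_s(x_k)) for every pooled sample k -- no binning
c = np.array([np.exp(-0.5 * k * (samples - x0) ** 2 / kT) for k, x0 in biases])

# MBAR equation: f_s^{-1} = sum_k c_sk / sum_t N_t f_t c_tk, solved by iteration
f = np.ones(len(biases))
for _ in range(500):
    denom = (N_s[:, None] * f[:, None] * c).sum(axis=0)  # one term per sample
    f_new = 1.0 / (c / denom).sum(axis=1)
    if np.allclose(f_new, f, rtol=1e-12):
        break
    f = f_new

dA = kT * np.log(f[1] / f[0])  # free energy difference between the biased states

# WHAM weight of each pooled sample; binning these gives unbiased distributions
W = 1.0 / denom
W /= W.sum()
```

By symmetry dA should be ≈ 0 here; production implementations (e.g. the pymbar package cited above) also solve these equations in log space for numerical stability, which this sketch omits.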