1 CE 530 Molecular Simulation Lecture 22 Chain-Molecule Sampling Techniques David A. Kofke Department of Chemical Engineering SUNY Buffalo

2 Monte Carlo Sampling
• The MC method permits great flexibility in developing improved sampling methods
• Biasing methods improve sampling without changing the limiting distribution: modification of the trial probabilities is compensated by changes in the acceptance and reverse-trial probabilities
• Non-Boltzmann sampling methods modify the limiting distribution: the desired ensemble average is obtained as a weighted average over the non-Boltzmann sample, ⟨A⟩_0 = ⟨A π_0/π_W⟩_W / ⟨π_0/π_W⟩_W, where π_W is the sampled distribution and π_0 is the target
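
A minimal reweighting sketch in Python (my own illustration, not part of the lecture): it assumes each sampled configuration carries the ratio of target weight to sampled weight, and recovers the target-ensemble average as the weighted average written above.

def reweighted_average(values, weight_ratios):
    # <A>_0 = sum_k A_k w_k / sum_k w_k, where w_k = pi_0(k)/pi_W(k) for sample k
    num = sum(a * w for a, w in zip(values, weight_ratios))
    den = sum(weight_ratios)
    return num / den

# Hypothetical usage with three sampled values and their weight ratios
print(reweighted_average([1.0, 2.0, 3.0], [0.5, 1.0, 2.0]))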

3 Simulating Chain Molecules
• Chains are slow to explore different parts of phase space
• Concerted moves are needed to untangle chains
• Algorithms based solely on single-atom moves may be non-ergodic

4 Modeling Chain Molecules
• Detailed models use the full array of potentials discussed previously: LJ atoms with torsion, bend, and stretch intramolecular potentials
• Other models aim to capture qualitative features of polymer behavior: hard- or soft-sphere atoms with only a stretch potential (bead-spring; tangent spheres; finitely extensible nonlinear elastic, FENE); each unit of the model might represent a multi-unit segment of the true polymer; this is the only feasible approach for very long chains (>10^3 units)
• Lattice models are very helpful: discretize space; various choices of lattice symmetry; the chain occupies contiguous sites on the lattice, one chain unit per site

5 Generating Configurations of Chains
• Open ensembles (grand-canonical) are often preferred: insertion and removal of chains enhances sampling of configurations
• Insertions and removals are difficult!
• We'll examine three approaches: simple sampling, configurational bias, and pruned-enriched sampling
• Consider the methods in the context of a simple hard-exclusion model (no attraction, no bending energy): all non-overlapping chain configurations are weighted uniformly

6 Simple Sampling
• Molecules are inserted and deleted in an unbiased fashion
• Stepwise insertion: after j segments have been inserted, the (j+1)th segment is placed at random at one of the sites adjoining the last segment
• Any attempt that leads to an overlap with an existing segment causes the whole trial to be immediately discarded
(figure: growth attempts labeled "reject", "continue trial", and "successful insertion")
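
A minimal Python sketch of this procedure (my own illustration, assuming a square lattice and the hard-exclusion model; "occupied" is a hypothetical set of sites already taken by other chains):

import random

NEIGHBORS = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def simple_insertion(occupied, start, n_units, rng=random):
    # Grow an n_units chain by unbiased stepwise insertion; return the chain's
    # sites on success, or None if any step overlaps (whole trial discarded).
    if start in occupied:
        return None
    chain = [start]
    for _ in range(n_units - 1):
        x, y = chain[-1]
        dx, dy = rng.choice(NEIGHBORS)          # random site adjoining the last segment
        site = (x + dx, y + dy)
        if site in occupied or site in chain:   # overlap: reject the entire trial
            return None
        chain.append(site)
    return chain

# Hypothetical usage: try to insert a 7-unit chain into an empty lattice
print(simple_insertion(set(), (0, 0), 7, random.Random(1)))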

7 Simple Sampling: Insertion Likelihood
• What is the probability that this trial will occur using simple insertion?
• In-class assignment 1: figure it out
(figure: a 7-unit chain inserted onto a lattice with 63 available sites)

8 Simple Sampling: Insertion Likelihood
• What is the probability that this trial will occur using simple insertion?
• Insertion probability for the first unit: 1/63 (63 available sites)
• Insert six more units, each with probability 1/3 of going in the “right” spot: (1/3)^6
• Could begin on either end of the chain: multiply by 2
• Total probability is the product: 2 × (1/63) × (1/3)^6 ≈ 4.4 × 10^-5
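
A quick numerical check of that product (my own addition, in Python):

# first unit on one of 63 open sites; six more units at 1/3 each;
# factor of 2 for the choice of which end of the chain to grow from
p = 2 * (1.0 / 63) * (1.0 / 3) ** 6
print(p)    # about 4.4e-5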

9 Configurational Bias Monte Carlo
• Based on the 1955 idea of Rosenbluth & Rosenbluth
• Apply a bias during growth of the chain, so that overlaps do not lead to rejection of the entire trial
• Remove the bias during acceptance of the complete trial: accumulate a “Rosenbluth weight” during the course of the trial
(figure: growth attempts labeled "continue trial")

10 Configurational-Bias Insertion/Deletion Trial: Analysis of Trial Probabilities
• Detailed specification of the trial moves and probabilities: the forward-step trial probability and the reverse-step trial probability (we’ll work this out later)

11 Configurational-Bias Insertion/Deletion Trial: Analysis of Detailed Balance
• Detailed balance: π_i π_ij = π_j π_ji, where π_i is the limiting distribution and π_ij, π_ji are built from the forward-step and reverse-step trial probabilities

12 Configurational-Bias Insertion/Deletion Trial: Analysis of Detailed Balance
• Detailed balance: π_i π_ij = π_j π_ji
• The forward-step and reverse-step trial probabilities fix the acceptance probability; the energy is zero in both configurations (hard-exclusion model)

13 Rosenbluth Weight
• What is W?
• 1/W is the probability that the chain would be inserted into the given position
• Each placement of a unit in the chain is selected with probability 1/w_i, where w_i is the number of non-overlap “sibling” alternatives available at generation i of the overall insertion
• The probability of making this particular insertion is ∏_i (1/w_i) = 1/W, so W = ∏_i w_i
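
A minimal Python sketch of athermal configurational-bias growth on a square lattice (my own illustration; the first unit is placed at a given site here, so any factor for choosing the first site is omitted):

import random

NEIGHBORS = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def cbmc_grow(occupied, start, n_units, rng=random):
    # Return (chain, W).  At each generation only the non-overlapping neighbor
    # sites are offered; w_i is the number of those "sibling" alternatives, and
    # the Rosenbluth weight is W = prod_i w_i, so 1/W is the probability of
    # having generated exactly this chain.
    if start in occupied:
        return None, 0
    chain, weight = [start], 1
    for _ in range(n_units - 1):
        x, y = chain[-1]
        open_sites = [(x + dx, y + dy) for dx, dy in NEIGHBORS
                      if (x + dx, y + dy) not in occupied
                      and (x + dx, y + dy) not in chain]
        if not open_sites:                      # trapped: growth fails
            return None, 0
        weight *= len(open_sites)               # accumulate w_i
        chain.append(rng.choice(open_sites))
    return chain, weight

chain, W = cbmc_grow(set(), (0, 0), 7, random.Random(1))
print(chain, W)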

14 NVT Configuration Sampling
• CBMC is also used to generate new configurations of the molecules already present
• Acceptance of any move is based on the Rosenbluth weights of the given move and of the reverse move
• Example: W_A = 63, W_B = 252; the move A → B is accepted with probability 1, and the move B → A is accepted with probability 63/252 = 1/4
(figure: configurations A and B)

15 Attractive Interactions
• Molecules with attraction: the generalization uses the Boltzmann factor to formulate the Rosenbluth weight
• At each step the weight is w_i = Σ_j exp(−βu_j), summing over the candidate sites j
• The probability of selecting site j is exp(−βu_j) / w_i
• Before (hard exclusion only), this was a sum of terms that were each either zero or one
(figure: example configuration on the lattice)
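
A minimal Python sketch of one growth step with attractions (my own illustration; "u_of_site" is a hypothetical function giving the energy of placing the next unit at a candidate site, infinite for an overlap):

import math
import random

def choose_next_site(candidate_sites, u_of_site, beta, rng=random):
    # Step weight w_i = sum_j exp(-beta * u_j); site j is selected with
    # probability exp(-beta * u_j) / w_i.
    boltz = [math.exp(-beta * u_of_site(s)) for s in candidate_sites]
    w_i = sum(boltz)
    if w_i == 0.0:                      # every candidate overlaps: step fails
        return None, 0.0
    r, running = rng.random() * w_i, 0.0
    for site, b in zip(candidate_sites, boltz):
        running += b
        if r <= running:
            return site, w_i
    return candidate_sites[-1], w_i

# Hypothetical usage: three candidate sites with energies 0, -1, and an overlap
sites = [(1, 0), (0, 1), (-1, 0)]
energies = {(1, 0): 0.0, (0, 1): -1.0, (-1, 0): math.inf}
print(choose_next_site(sites, energies.get, 1.0))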

16-24 Attractive Interactions (continued)
• Slides 16-24 repeat these expressions while stepping through the growth of the example chain in the accompanying figures, evaluating w_i = Σ_j exp(−βu_j) and the selection probabilities as each unit is added
• Slide 20 poses in-class assignment 2: get the next term

25 Attractive Interactions
• Molecules with attraction: the generalization uses the Boltzmann factor to formulate the Rosenbluth weight
• At each step the weight is w_i = Σ_j exp(−βu_j), and the probability of selecting site j is exp(−βu_j) / w_i
• W is used just as before: accept with probability min[1, W_new/W_old]
• The energy contribution is built in: the probability of growing a particular chain is ∏_i exp(−βu_i)/w_i = exp(−βU_chain)/W
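
A minimal acceptance sketch in Python (my own illustration), comparing the Rosenbluth weight of the newly grown chain with that of the retraced old chain:

import random

def accept_cbmc(W_new, W_old, rng=random):
    # Accept with probability min[1, W_new / W_old]; a failed growth has W_new = 0.
    if W_new <= 0.0:
        return False
    return rng.random() < min(1.0, W_new / W_old)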

26 A General Result for Markov Processes 1.
• Consider a process in which there are several ways to generate each trial i → j
• To enforce detailed balance, all routes should be considered in formulating the acceptance probability
• If there are many ways to generate the trial, this can pose difficulties

27 A General Result for Markov Processes 2.
• Consider the following recipe for a single-step trial:
• Generate the trial i → j via route (a), with probability τ_ij^(a)
• Choose a reverse trial j → i via one of the routes, say (b); choose it with the probability that it would occur as the j → i route: τ_ji^(b) / τ_ji
• Accept the (forward) trial as if (a) and (b) were the only routes
• This recipe satisfies detailed balance for the overall transition i → j

28 Off the Lattice, et cetera
• CBMC can be extended to off-lattice models
• Choose a set of trial orientations at random for each atom insertion
• Once a chain is generated in its new position, perform the same operation tracing out its original location
• Compile the Rosenbluth weight for the new and original chains to use in the acceptance
• Note that each insertion may be accomplished via multiple routes, differing in the discarded atom trials

29 CBMC General Comments
• The method begins to fail for sufficiently long chains, perhaps with as few as 10 atoms
• Extensions of the method: Gibbs ensemble; branched polymers; partial chain regrowth; chemical-potential calculation
• The general idea can be applied in other ways: a multi-step trial is broken into smaller decisions, with the acceptance including consideration of the choices not taken

30 Parallel Tempering 1.
• At high temperature a broader range of configurations is sampled
• Barriers to transitions are lowered
• How to simulate a low-temperature system with high-temperature barrier removal?
(figure: ensemble weight e^(−U/kT) versus phase space Γ; the low-temperature system is trapped behind a barrier while the high-temperature system escapes)

31 Parallel Tempering 2.
• Simulate loosely coupled high- and low-temperature systems in parallel
• Perform moves in which the two systems swap configurations
• Accept based on min[1, exp{(β_1 − β_2)(U_1 − U_2)}], where 1 and 2 label the two systems
(figure: high-temperature and low-temperature replicas)
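
A minimal Python sketch of the swap decision (my own illustration, using the standard criterion above; β = 1/kT):

import math
import random

def accept_swap(beta_1, U_1, beta_2, U_2, rng=random):
    # Exchange the configurations of replicas 1 and 2 with probability
    # min[1, exp{(beta_1 - beta_2)(U_1 - U_2)}].
    arg = (beta_1 - beta_2) * (U_1 - U_2)
    return arg >= 0.0 or rng.random() < math.exp(arg)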

32 Parallel Tempering 3.
• To get a reasonable acceptance rate, the temperatures should not be too different
• Can be extended to include any number of systems simulated in parallel
• Can be extended to do “tempering” in other variables, such as the chemical potential
• Very well suited for use in conjunction with histogram reweighting

33 Pruned-Enriched Rosenbluth Method
• At some point along the growth process it may become clear that the chain is doomed, or that it is doing really well
• We’d like to enrich the population of the good ones, while pruning out the ones that look bad
• Use a criterion based on the partial Rosenbluth weight

34 Pruned-Enriched Rosenbluth Method
(figure: chain-growth tree with branches annotated “so far, so good; let’s make another”, “not so good; time to prune”, “make another”, “other branches”, and “prune”)

35 Pruned-Enriched Rosenbluth Method
• Set cutoffs for the intermediate Rosenbluth weights
• Duplicate any configuration having W > W> (the upper cutoff), halving the weights of the new duplicates
• Prune configurations having W < W< (the lower cutoff), discarding every other such configuration and doubling the weight of those that remain
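
A minimal Python sketch of the prune/enrich decision for one partially grown chain (my own illustration; the cutoff names W_hi and W_lo are assumptions):

import random

def prune_or_enrich(chain, W, W_hi, W_lo, rng=random):
    # Return the list of (chain, weight) copies that continue growing.
    if W > W_hi:                        # enrich: two copies, each with half the weight
        return [(list(chain), W / 2.0), (list(chain), W / 2.0)]
    if W < W_lo:                        # prune: keep with probability 1/2 ...
        if rng.random() < 0.5:
            return []
        return [(chain, 2.0 * W)]       # ... and double the survivor's weight
    return [(chain, W)]                 # in between: continue unchanged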