Exploring the connection between sampling problems in Bayesian inference and statistical mechanics Andrew Pohorille NASA-Ames Research Center.

Presentation transcript:

1 Exploring the connection between sampling problems in Bayesian inference and statistical mechanics Andrew Pohorille NASA-Ames Research Center

2 Outline
- Enhanced sampling of pdfs
  - flat histograms: multicanonical method, Wang-Landau, transition probability method
  - parallel tempering
- Dynamical systems
- Stochastic kinetics

3 Enhanced sampling techniques

4 Preliminaries
define:
- variables x, β, N
- a function U(x, β, N)
- a probability P(x, β, N) = exp[−βU(x, β, N)] / Q(β, N), with β = 1/kT (energies are Boltzmann-distributed)
- marginalize over x to define a "free energy" or "thermodynamic potential" A(β, N) = −(1/β) ln Q(β, N), where Q(β, N) = ∫ exp[−βU(x, β, N)] dx is the partition function
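As a concrete illustration of these definitions (a minimal sketch; the discrete three-state system and the choice β = 1 are assumptions for illustration, not from the slides), the Boltzmann probabilities, partition function Q, and thermodynamic potential A can be computed directly:

```python
import math

def boltzmann(energies, beta):
    """Boltzmann probabilities for a discrete set of states.

    P(x) = exp(-beta*U(x)) / Q,  Q = sum_x exp(-beta*U(x)),
    A = -(1/beta) * ln Q  ("free energy" / thermodynamic potential).
    """
    weights = [math.exp(-beta * u) for u in energies]
    Q = sum(weights)
    probs = [w / Q for w in weights]
    A = -math.log(Q) / beta
    return probs, Q, A

# three illustrative energy levels at beta = 1/kT = 1
probs, Q, A = boltzmann([0.0, 1.0, 2.0], beta=1.0)
```

Lower-energy states receive exponentially larger probability, which is exactly why high-energy (rarely visited) regions are hard to sample directly.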

5 The problem: What to do if the free energy A(N) is difficult to estimate because we can’t get sufficient statistics for all N of interest.

6 Flat histogram approach: reweight the pdf with a weight η(N) so that it is sampled uniformly for all β, N.

7 Example: original pdf → weighted pdf (multiplied by exp[η(N)]) → marginalization → "canonical" partition function. 1. get η  2. get Q

8 General MC sampling scheme [diagram: insertion and deletion moves, with on-the-fly adjustment of the weights / free energy]

9 Multicanonical method: weights η(N) are built from the bin counts of the histogram, with a shift fixing the normalization of η. Berg and Neuhaus, Phys. Rev. Lett. 68, 9 (1992)

10 The algorithm
1. Start with any weights (e.g. η1(N) = 0)
2. Perform a short simulation and measure P(N; η1) as a histogram
3. Update the weights according to the measured histogram
4. Iterate until P(N; ηi) is flat, or nearly so
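The iteration can be sketched on a toy distribution. The update rule η(N) ← η(N) − ln P(N; η) is the standard multicanonical choice (the slide's equation did not carry over), the five-state pdf is an assumption for illustration, and the "short simulation" is replaced by an exact evaluation of the weighted histogram:

```python
import math

# Toy unnormalized "original" pdf p0(N) over N = 0..4 (illustrative)
p0 = [math.exp(-1.5 * (n - 2) ** 2) for n in range(5)]

eta = [0.0] * 5  # start with any weights, e.g. eta(N) = 0

def measure(eta):
    """Idealized 'short simulation': the histogram observed under
    weights eta is P(N; eta) proportional to p0(N) * exp(eta(N))."""
    w = [p * math.exp(e) for p, e in zip(p0, eta)]
    z = sum(w)
    return [x / z for x in w]

for _ in range(30):  # iterate until P(N; eta) is flat
    P = measure(eta)
    # multicanonical update: eta(N) <- eta(N) - ln P(N; eta)
    eta = [e - math.log(p) for e, p in zip(eta, P)]

P = measure(eta)  # final histogram, now flat
```

Once the histogram is flat, η(N) differs from −ln p0(N) only by a constant, so the weights directly encode the free energy A(N).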

11 Typical example

12 Wang-Landau sampling: entropy S(E); acceptance criterion p(1 → 2) = min[1, exp(S(E1) − S(E2))]; update constant g, added to the entropy of the visited state. Example: estimate entropies for (discrete) states. Wang and Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E 64, 056101 (2001)

13 The algorithm
1. Set the entropies of all states to zero; set the initial g
2. Accept/reject moves according to the criterion p(1 → 2) = min[1, exp(S(E1) − S(E2))]
3. Always update the entropy estimate of the end state: S(E) ← S(E) + g
4. When the histogram (pdf) is flat, reduce g
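A minimal sketch of these steps on a toy system: four independent spins with "energy" E = number of up spins, so the exact density of states is C(4, E) and the exact relative entropy is ln C(4, E). For simplicity each stage runs a fixed number of steps instead of testing histogram flatness, and all parameters (number of stages, steps per stage, halving of g) are illustrative:

```python
import math, random

random.seed(1)

N = 4
spins = [0] * N
S = [0.0] * (N + 1)   # entropy estimates, all zero initially
g = 1.0               # initial update constant

for stage in range(20):
    for _ in range(20000):
        i = random.randrange(N)
        E1 = sum(spins)
        E2 = E1 + (1 if spins[i] == 0 else -1)
        # acceptance criterion: min(1, exp(S(E1) - S(E2)))
        if random.random() < math.exp(S[E1] - S[E2]):
            spins[i] ^= 1          # flip the spin
        E = sum(spins)
        S[E] += g                  # always update the end state's entropy
    g /= 2.0                       # reduce g each stage

# relative entropies vs. exact ln C(4, E)
est = [s - S[0] for s in S]
exact = [math.log(math.comb(4, e)) for e in range(N + 1)]
```

Because moves into high-entropy states are penalized by exp(S1 − S2), the walk spends comparable time in all energy levels, and S converges to the log density of states up to an additive constant.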

14 Transition probability method [diagram: microstates i, j grouped into macrostates I, J, K]. Wang, Tay, Swendsen, Phys. Rev. Lett. 82, 476 (1999); Fitzgerald et al., J. Stat. Phys. 98, 321 (1999)

15 detailed balance: P(i) T(i → j) = P(j) T(j → i); macroscopic detailed balance (over macrostates): P(I) T(I → J) = P(J) T(J → I)

16 Parallel tempering
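This slide's content is graphical; as a hedged sketch of the method, several replicas are run at different inverse temperatures and neighboring pairs occasionally exchange configurations with probability min[1, exp((βi − βj)(Ui − Uj))], which preserves detailed balance of the joint distribution. The double-well potential, temperature ladder, and step size below are illustrative choices:

```python
import math, random

random.seed(0)

def U(x):                        # illustrative double-well potential
    return (x * x - 1.0) ** 2

betas = [0.2, 0.5, 1.0, 2.0]     # one replica per inverse temperature
xs = [0.0] * len(betas)

def metropolis_step(i):
    """Ordinary Metropolis move for replica i at its own temperature."""
    x, b = xs[i], betas[i]
    y = x + random.uniform(-0.5, 0.5)
    if random.random() < math.exp(min(0.0, -b * (U(y) - U(x)))):
        xs[i] = y

samples = [[] for _ in betas]
for sweep in range(20000):
    for i in range(len(betas)):
        metropolis_step(i)
    # attempt a swap between a random neighboring pair of replicas
    i = random.randrange(len(betas) - 1)
    d = (betas[i] - betas[i + 1]) * (U(xs[i]) - U(xs[i + 1]))
    if random.random() < math.exp(min(0.0, d)):
        xs[i], xs[i + 1] = xs[i + 1], xs[i]
    for i in range(len(betas)):
        samples[i].append(U(xs[i]))

mean_hot = sum(samples[0]) / len(samples[0])     # beta = 0.2
mean_cold = sum(samples[-1]) / len(samples[-1])  # beta = 2.0
```

The hot replicas cross the barrier easily and feed decorrelated configurations down the temperature ladder, so the cold replica samples both wells without getting trapped.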

17 Dynamical systems

18 The idea:
- the system evolves according to equations of motion (possibly Hamiltonian)
- we need to assign masses to the variables
Assumption: ergodicity

19 Advantages
- No need to design sampling techniques
- Specialized methods for efficient sampling are available (Laio-Parrinello, Adaptive Biasing Force)
Disadvantages
- No probabilistic sampling
- Possible complications with the assignment of masses

20 Two formulations: Hamiltonian and Lagrangian. Numerical, iterative solution of the equations of motion yields a trajectory.
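The numerical, iterative solution can be sketched for the Hamiltonian formulation with a harmonic oscillator, H = p²/2m + kx²/2 (the mass, force constant, and time step are illustrative; the integrator here is velocity Verlet, a standard symplectic choice not named on the slide):

```python
# Velocity Verlet integration of Hamilton's equations for H = p^2/2m + k x^2/2
m, k = 1.0, 1.0
x, p = 1.0, 0.0      # initial position and momentum
dt = 0.01            # time step

def force(x):
    return -k * x    # F = -dU/dx

def energy(x, p):
    return p * p / (2.0 * m) + 0.5 * k * x * x

E0 = energy(x, p)
for _ in range(10000):
    p += 0.5 * dt * force(x)   # half kick
    x += dt * p / m            # drift
    p += 0.5 * dt * force(x)   # half kick
```

Good long-time energy conservation is what makes such trajectories usable for sampling; a poorly chosen time step (cf. the mass-assignment caveats on the next slide) destroys it.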

21 Assignment of masses
- Masses too large: slow motions
- Masses too small: difficult integration of the equations of motion
- Large separation of masses: adiabatic separation
Thermostats are available:
- Lagrangian, e.g. Nosé-Hoover
- Hamiltonian, e.g. Leimkuhler
Energy equipartition needs to be addressed

22 Adaptive Biasing Force
ΔA = ∫_{ξa}^{ξb} ⟨∂H(ξ)/∂ξ⟩_{ξ*} dξ*
The mean force along ξ is estimated on the fly and a biasing force that cancels it is applied, flattening the free-energy profile ΔA(ξ). [plot: force and ΔA along ξ]
Darve and Pohorille, J. Chem. Phys. 115, 9169-9183 (2001)

23 Summary
- A variety of techniques are available to efficiently sample rarely visited states.
- Adaptive methods modify the sampling while the solution is being built.
- One can construct dynamical systems that seek the solution, and efficient adaptive techniques are available, but this needs to be done carefully.

24 Stochastic kinetics

25 The problem
- {Xi} objects, i = 1, …, N
- ni copies of each object
- the objects undergo r transformations with rates {kμ}, μ = 1, …, r
Assumptions:
- {kμ} are constant
- the process is Markovian (well-stirred reactor)

26 Example: 7 objects, 5 transformations [reaction-network diagram]

27 Deterministic solution: concentrations obey the kinetics (differential equations); the steady state satisfies algebraic equations. Works well for large {ni} (fluctuations are suppressed).
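The deterministic route can be sketched for the simplest case, a single transformation A → B with constant rate k, where the kinetics reduce to dnA/dt = −k·nA (the rate, initial counts, and forward-Euler time step here are illustrative):

```python
import math

# Deterministic kinetics for A -> B: dnA/dt = -k * nA (forward Euler)
k = 0.5
nA, nB = 1000.0, 0.0
dt, T = 0.001, 4.0

for _ in range(int(T / dt)):
    dn = k * nA * dt     # amount converted in this step
    nA -= dn
    nB += dn

# analytic solution for comparison: nA(T) = nA(0) * exp(-k*T)
nA_exact = 1000.0 * math.exp(-k * T)
```

For large copy numbers this agrees with the average of the stochastic treatment; it is for small ni, where fluctuations matter, that the statistical alternative below is needed.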

28 A statistical alternative: generate trajectories by asking which reaction occurs next, and when it occurs.
- next reaction is μ at time τ: P(τ, μ) = aμ exp(−a0 τ)
- next reaction is μ at any time: aμ / a0
- any reaction at time τ: a0 exp(−a0 τ)
(aμ: propensity of reaction μ; a0 = Σμ aμ)

29 Direct method - Algorithm
1. Initialization
2. Calculate the propensities {aμ}
3. Choose μ with probability aμ / a0 (random number)
4. Choose τ from a0 exp(−a0 τ) (random number)
5. Update the numbers of molecules and reset t ← t + τ
6. Go to step 2
Gillespie, J. Phys. Chem. 81, 2340 (1977)
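The steps above can be sketched as follows; the encoding of each reaction as (reactant indices, count changes) is a hypothetical convenience for the example, not notation from the slides:

```python
import math, random

random.seed(2)

def gillespie_direct(n, rates, stoich, t_end):
    """Direct method for mass-action reactions.
    n: molecule counts per species; rates: k_mu per reaction;
    stoich: (reactant indices, count changes) per reaction."""
    t, times, traj = 0.0, [0.0], [list(n)]
    while t < t_end:
        # step 2: propensities a_mu = k_mu * product of reactant counts
        a = []
        for k_mu, (reactants, _) in zip(rates, stoich):
            prop = k_mu
            for s in reactants:
                prop *= n[s]
            a.append(prop)
        a0 = sum(a)
        if a0 == 0.0:
            break                      # nothing left to react
        # step 3: choose reaction mu with probability a_mu / a0
        r, acc = random.random() * a0, 0.0
        for mu, a_mu in enumerate(a):
            acc += a_mu
            if r < acc:
                break
        # step 4: time to next reaction, tau ~ a0 * exp(-a0 * tau)
        tau = -math.log(1.0 - random.random()) / a0
        # step 5: update molecule numbers and the clock
        for s, d in stoich[mu][1]:
            n[s] += d
        t += tau
        times.append(t)
        traj.append(list(n))
    return times, traj

# Example: a single reaction A -> B with k = 1.0 (illustrative network)
times, traj = gillespie_direct([50, 0], [1.0],
                               [([0], [(0, -1), (1, +1)])], 100.0)
```

Each trajectory is one exact sample of the underlying Markov jump process; averaging many trajectories recovers the deterministic kinetics when the counts are large.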

30 First reaction method - Algorithm
1. Initialization
2. Calculate the propensities {aμ}
3. For each μ, generate τμ from aμ exp(−aμ τμ) (random numbers)
4. Choose the reaction for which τμ is the shortest
5. Set τ = τμ
6. Update the numbers of molecules and reset t ← t + τ
7. Go to step 2
Gillespie, J. Phys. Chem. 81, 2340 (1977)
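A sketch of one step of this method, using a hypothetical (reactant indices, count changes) encoding of the reactions; the two-reaction example network and its rates are illustrative:

```python
import math, random

random.seed(3)

def first_reaction_step(n, rates, stoich):
    """One step: draw a tentative time tau_mu ~ a_mu * exp(-a_mu * tau)
    for every reaction, then fire the one with the shortest tau_mu."""
    best_mu, best_tau = None, math.inf
    for mu, (k_mu, (reactants, _)) in enumerate(zip(rates, stoich)):
        a_mu = k_mu
        for s in reactants:
            a_mu *= n[s]
        if a_mu > 0.0:
            tau_mu = -math.log(1.0 - random.random()) / a_mu
            if tau_mu < best_tau:
                best_mu, best_tau = mu, tau_mu
    if best_mu is None:
        return None                    # no reaction can fire
    for s, d in stoich[best_mu][1]:    # update molecule numbers
        n[s] += d
    return best_tau

# Example: A -> B (k = 2.0) and A -> C (k = 1.0); species order [A, B, C]
n = [30, 0, 0]
stoich = [([0], [(0, -1), (1, +1)]),
          ([0], [(0, -1), (2, +1)])]
t = 0.0
while True:
    tau = first_reaction_step(n, [2.0, 1.0], stoich)
    if tau is None:
        break
    t += tau
```

This draws r random numbers per step versus two for the direct method, which is what the next reaction method improves by reusing the unfired tentative times.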

31 Next reaction method: complexity O(log r). Gibson and Bruck, J. Phys. Chem. A 104, 1876 (2000)

32 Extensions
- k = k(t) (Gibson and Bruck)
- Non-Markovian processes (Gibson and Bruck)
- Stiff reactions (Eric Vanden-Eijnden)
- Enzymatic reactions (A.P.)

