1 Cluster Monte Carlo Algorithms: Jian-Sheng Wang National University of Singapore

2 Outline Introduction to Monte Carlo and statistical mechanical models Cluster algorithms Replica Monte Carlo

3 1. Introduction to MC and Statistical Mechanical Models

4 Stanislaw Ulam (1909-1984) S. Ulam is credited as the inventor of the Monte Carlo method in the 1940s, which solves mathematical problems using statistical sampling.

5 Nicholas Metropolis (1915-1999) The algorithm by Metropolis (and A. Rosenbluth, M. Rosenbluth, A. Teller, and E. Teller, 1953) has been cited as among the top 10 algorithms having the "greatest influence on the development and practice of science and engineering in the 20th century."

6 The Name of the Game Metropolis coined the name "Monte Carlo", after the famous gambling casino in Monte Carlo, Monaco.

7 Use of Monte Carlo Methods Solving mathematical problems (numerical integration, numerical partial differential equations, integral equations, etc.) by random sampling; using random numbers in an essential way; simulation of stochastic processes.

8 Markov Chain Monte Carlo Generate a sequence of states X_0, X_1, …, X_n, such that the limiting distribution is the given P(X). Move X with the transition probability W(X -> X'). Starting from an arbitrary P_0(X), we have P_{n+1}(X) = ∑_X' P_n(X') W(X' -> X), and P_n(X) approaches P(X) as n goes to ∞.
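As a concrete illustration (not from the slides), the following minimal C sketch iterates P_{n+1} = P_n W for a hypothetical 3-state chain; the transition matrix W is made up purely for demonstration, and the printed P_n settles to the limiting distribution.

#include <stdio.h>

#define NS 3   /* number of states in this toy example */

int main(void)
{
    /* A made-up row-stochastic transition matrix W[x][x']. */
    double W[NS][NS] = { {0.5, 0.3, 0.2},
                         {0.2, 0.6, 0.2},
                         {0.3, 0.3, 0.4} };
    double P[NS] = {1.0, 0.0, 0.0};   /* arbitrary starting distribution P_0 */
    double Pnew[NS];
    int n, x, xp;

    for (n = 0; n < 50; ++n) {
        for (xp = 0; xp < NS; ++xp) {          /* P_{n+1}(X) = sum_X' P_n(X') W(X'->X) */
            Pnew[xp] = 0.0;
            for (x = 0; x < NS; ++x)
                Pnew[xp] += P[x] * W[x][xp];
        }
        for (x = 0; x < NS; ++x)
            P[x] = Pnew[x];
    }
    printf("P_50 = (%g, %g, %g)\n", P[0], P[1], P[2]);  /* close to the limiting P(X) */
    return 0;
}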

9 Ergodicity [W^n](X -> X') > 0 for all n > n_max and all X, X'. Detailed Balance P(X) W(X -> X') = P(X') W(X' -> X). Ergodicity is necessary; together with detailed balance it is sufficient for convergence to P(X).

10 Taking Statistics After equilibration, we estimate ⟨Q⟩ ≈ (1/N) ∑_t Q(X_t). It is necessary that we take data for every sample or at uniform intervals. It is an error to omit samples conditionally (i.e., depending on their values), as this biases the average.

11 Choice of Transition Matrix W The choice of W determines the algorithm. The equation P = PW, or P(X) W(X->X') = P(X') W(X'->X), has (infinitely) many solutions W for a given P. Any one of them can be used for Monte Carlo simulation.

12 Metropolis Algorithm (1953) The Metropolis algorithm takes W(X->X') = T(X->X') min(1, P(X')/P(X)) for X ≠ X', where T is a symmetric stochastic matrix, T(X -> X') = T(X' -> X).

14 Model Gas/Fluid A collection of molecules interacting through some potential (hard cores can be treated); compute the equation of state: the pressure p as a function of the particle density ρ = N/V. (Compare the ideal gas law PV = N k_B T.)

15 The Statistical Mechanics of Classical Gas/(complex) Fluids/Solids Compute the multi-dimensional configurational integral ⟨A⟩ = (1/Z) ∫ A(r_1, …, r_N) exp[-U(r_1, …, r_N)/(k_B T)] d^3r_1 … d^3r_N, where the potential energy U is typically a sum of pair potentials, U = ∑_{i<j} u(|r_i - r_j|).

16 The Ising Model The energy of a configuration σ is E(σ) = -J ∑_⟨ij⟩ σ_i σ_j, where the sum runs over nearest-neighbor pairs ⟨ij⟩ on a lattice, σ_i = ±1, and σ = {σ_1, σ_2, …, σ_i, …}.

17 The Potts Model The energy of a configuration σ is E(σ) = -J ∑_⟨ij⟩ δ(σ_i, σ_j), with σ_i = 1, 2, …, q. See F. Y. Wu, Rev Mod Phys 54 (1982) 238 for a review.

18 Metropolis Algorithm Applied to Ising Model (Single-Spin Flip) 1. Pick a site i at random. 2. Compute ΔE = E(σ') - E(σ), where σ' is the new configuration with the spin at site i flipped, σ'_i = -σ_i. 3. Perform the move if ξ < exp(-ΔE/kT), where 0 < ξ < 1 is a uniform random number.
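A minimal C sketch of one single-spin-flip Metropolis sweep for the 2D Ising model (not from the slides); the lattice size L, the array s[], and the helper delta_E() are illustrative names, and periodic boundary conditions are assumed.

#include <math.h>
#include <stdlib.h>

#define L 32                 /* linear lattice size (assumption) */
#define N (L*L)

static int s[N];             /* spins, +1 or -1 */

/* Energy change (in units of J) for flipping the spin at site i:
   dE = 2 s_i * (sum of the four nearest neighbors), periodic boundaries. */
static int delta_E(int i)
{
    int x = i % L, y = i / L;
    int sum = s[((x+1)%L) + y*L] + s[((x-1+L)%L) + y*L]
            + s[x + ((y+1)%L)*L] + s[x + ((y-1+L)%L)*L];
    return 2 * s[i] * sum;
}

/* One Metropolis sweep: N attempted single-spin flips at coupling K = J/(kT). */
void metropolis_sweep(double K)
{
    int n;
    for (n = 0; n < N; ++n) {
        int i = (int)(drand48() * N);          /* pick a site at random */
        int dE = delta_E(i);
        if (dE <= 0 || drand48() < exp(-K * dE))
            s[i] = -s[i];                      /* accept the flip */
    }
}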

19 Boltzmann Distribution In statistical mechanics, thermodynamic results are obtained as expectation values (averages) over the Boltzmann (Gibbs) distribution P(X) = exp(-E(X)/(k_B T))/Z, where Z = ∑_X exp(-E(X)/(k_B T)) is called the partition function.

20 2. Swendsen-Wang algorithm

21 Percolation Model Each pair of nearest-neighbor sites is occupied by a bond with probability p. The probability of a configuration X is p^b (1-p)^(M-b), where b is the number of occupied bonds and M is the total number of bonds.

22 Fortuin-Kasteleyn Mapping (1969) Z = ∑_σ exp[K ∑_⟨ij⟩ δ(σ_i, σ_j)] = exp(KM) ∑_n p^b (1-p)^(M-b) q^(N_c), summed over bond configurations n, where K = J/(k_B T), p = 1 - exp(-K), q is the number of Potts states, and N_c is the number of clusters.

23 Sweeny Algorithm (1983) Single-bond heat-bath moves in the random-cluster distribution P(X) ∝ (p/(1-p))^b q^(N_c). Heat-bath rates: if occupying or emptying the bond does not change the number of clusters, w(· -> occupied) = p and w(· -> empty) = 1 - p; if the bond joins or splits clusters, w(· -> occupied) = p/((1-p)q + p) and w(· -> empty) = (1-p)q/((1-p)q + p).

24 Swendsen-Wang Algorithm (1987) Start from an arbitrary Ising configuration, distributed according to the Boltzmann weight with K = J/(kT).

25 Swendsen-Wang Algorithm Put a bond between each nearest-neighbor pair with σ_i = σ_j, with probability p = 1 - exp(-K).

26 Swendsen-Wang Algorithm Erase the spins

27 Swendsen-Wang Algorithm Assign a new spin value to each cluster at random. An isolated single site is also considered a cluster. This brings us back to the joint distribution P(σ, n) again.

28 Swendsen-Wang Algorithm Erase bonds to finish one sweep. Go back to P(σ) again

29 Identifying the Clusters The Hoshen-Kopelman algorithm (1976) can be used. Each sweep takes O(N).
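As an illustration (not from the slides), here is a minimal C sketch of one Swendsen-Wang sweep for the Ising model, using a simple union-find in place of Hoshen-Kopelman for cluster identification; the arrays, the lattice size L, and the bond probability p are illustrative names, with p understood in the convention of the slides (p = 1 - exp(-K)).

#include <stdlib.h>

#define L 32
#define N (L*L)

static int s[N];        /* Ising spins, +1 or -1 */
static int parent[N];   /* union-find forest for cluster labels */

static int find(int i)                 /* find root with path halving */
{
    while (parent[i] != i)
        i = parent[i] = parent[parent[i]];
    return i;
}

static void unite(int i, int j)        /* merge the clusters of i and j */
{
    parent[find(i)] = find(j);
}

/* One Swendsen-Wang sweep at bond probability p. */
void swendsen_wang_sweep(double p)
{
    int i, x, y;
    int newspin[N];

    for (i = 0; i < N; ++i)
        parent[i] = i;                 /* every site starts as its own cluster */

    /* Activate bonds between equal nearest-neighbor spins with probability p. */
    for (y = 0; y < L; ++y)
        for (x = 0; x < L; ++x) {
            i = x + y*L;
            int right = ((x+1)%L) + y*L;
            int down  = x + ((y+1)%L)*L;
            if (s[i] == s[right] && drand48() < p) unite(i, right);
            if (s[i] == s[down]  && drand48() < p) unite(i, down);
        }

    /* Assign a random new spin to each cluster (stored at its root site). */
    for (i = 0; i < N; ++i)
        if (find(i) == i)
            newspin[i] = (drand48() < 0.5) ? 1 : -1;
    for (i = 0; i < N; ++i)
        s[i] = newspin[find(i)];       /* erase bonds implicitly; sweep done */
}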

30 Measuring Error Let Q_t be some quantity of interest at time step t; the sample average is Q_N = (1/N) ∑_t Q_t. We treat Q_N as a random variable. By the central limit theorem, Q_N is normally distributed with mean ⟨Q_N⟩ = ⟨Q⟩ and variance σ_N^2 = ⟨Q_N^2⟩ - ⟨Q_N⟩^2, where ⟨…⟩ stands for the average over the exact distribution.

31 Estimating Variance H. Müller-Krumbhaar and K. Binder, J Stat Phys 8 (1973) 1.

32 Error Formula The above derivation gives the well-known error estimate in Monte Carlo: σ_N^2 ≈ (var(Q)/N) (1 + 2 τ_int), where var(Q) = ⟨Q^2⟩ - ⟨Q⟩^2 can be estimated by the sample variance of Q_t, and τ_int is the integrated correlation time defined on the next slide.

33 Time-Dependent Correlation Function and Integrated Correlation Time We define the normalized time-dependent correlation function f(t) = [⟨Q_s Q_(s+t)⟩ - ⟨Q⟩^2] / [⟨Q^2⟩ - ⟨Q⟩^2] and the integrated correlation time τ_int = ∑_(t≥1) f(t).
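A minimal C sketch (not from the slides) of estimating the mean, the integrated correlation time, and the resulting error bar from a time series Q_t, using the definitions above; the truncation of the f(t) sum at a fixed cutoff tmax is a simplification (in practice a self-consistent window is used).

#include <math.h>

/* Estimate mean and error bar of the series q[0..n-1], using
   error^2 ~ var(Q) (1 + 2 tau_int) / n with tau_int = sum_{t>=1} f(t). */
void mc_error(const double *q, int n, int tmax, double *mean, double *err)
{
    double m = 0.0, var = 0.0, tau = 0.0;
    int i, t;

    for (i = 0; i < n; ++i) m += q[i];
    m /= n;
    for (i = 0; i < n; ++i) var += (q[i] - m) * (q[i] - m);
    var /= n;

    for (t = 1; t <= tmax; ++t) {               /* f(t), truncated at tmax */
        double c = 0.0;
        for (i = 0; i < n - t; ++i)
            c += (q[i] - m) * (q[i + t] - m);
        c /= (n - t);
        tau += c / var;
    }

    *mean = m;
    *err  = sqrt(var * (1.0 + 2.0 * tau) / n);
}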

34 Critical Slowing Down [Figure: correlation time τ versus temperature T, diverging near T_c.] The correlation time becomes large near T_c. For a finite system τ(T_c) ∝ L^z, with dynamical critical exponent z ≈ 2 for local moves.

35 Much Reduced Critical Slowing Down Comparison of exponential correlation times (τ ∝ L^z) of Swendsen-Wang with single-spin-flip Metropolis at T_c for the 2D Ising model. From R. H. Swendsen and J. S. Wang, Phys Rev Lett 58 (1987) 86.

36 Comparison of integrated autocorrelation times at T_c for the 2D Ising model. From J.-S. Wang, O. Kozan, and R. H. Swendsen, Phys Rev E 66 (2002).

37 Wolff Single-Cluster Algorithm

/* Recursively flip the cluster containing site i, where s0 is the spin value
   before flipping, Z is the coordination number, and p is the bond probability. */
void flip(int i, int s0)
{
    int j, nn[Z];
    s[i] = -s0;                    /* flip the spin at site i */
    neighbor(i, nn);               /* fill nn[] with the Z neighbors of i */
    for (j = 0; j < Z; ++j)
        if (s0 == s[nn[j]] && drand48() < p)
            flip(nn[j], s0);       /* grow the cluster recursively */
}
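To use this routine, one Wolff step just picks a random seed site and flips its cluster; a minimal sketch, assuming the same globals s[] and p as in the slide and a hypothetical total site count N:

/* One Wolff step: pick a random seed site and flip the cluster containing it. */
void wolff_step(void)
{
    int i = (int)(drand48() * N);  /* random seed site */
    flip(i, s[i]);
}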

38 3. Replica Monte Carlo

39 Slowing Down at First-Order Phase Transitions At a first-order phase transition, the longest time scale is controlled by the interface barrier, τ ∝ exp(2βσL^(d-1)), where β = 1/(k_B T), σ is the interface free energy, d is the dimension, and L is the linear size.

40 Spin Glass Model An Ising model with random interactions, E(σ) = -∑_⟨ij⟩ J_ij σ_i σ_j, with two types of random but fixed coupling constants: ferromagnetic (J_ij > 0) and antiferromagnetic (J_ij < 0).

41 Replica Monte Carlo A collection of M systems at different inverse temperatures β_1, β_2, β_3, …, β_M is simulated in parallel, allowing exchange of information among the systems.

42 Moves between Replicas Consider two neighboring systems, σ^1 and σ^2; the joint distribution is P(σ^1, σ^2) ∝ exp[-β_1 E(σ^1) - β_2 E(σ^2)] = exp[-H_pair(σ^1, σ^2)]. Any valid Monte Carlo move should preserve this distribution.

43 Pair Hamiltonian in Replica Monte Carlo We define τ_i = σ_i^1 σ_i^2; then H_pair can be rewritten as H_pair = -∑_⟨ij⟩ K_ij σ_i^1 σ_j^1 with effective couplings K_ij = J_ij (β_1 + β_2 τ_i τ_j). H_pair is again a spin glass. If β_1 ≈ β_2 and the two systems have consistent signs (τ_i τ_j = +1), the interaction is twice as strong; if they have opposite signs, the interaction is approximately 0.

44 Cluster Flip in Replica Monte Carlo Clusters are defined as connected regions in which τ_i has the same sign (τ = +1 or τ = -1). The effective Hamiltonian for the clusters is H_cl = -∑ k_bc s_b s_c, where k_bc is the interaction strength between clusters b and c, given by the sum of K_ij over the boundary between clusters b and c. The Metropolis algorithm is used to flip the clusters, i.e., σ_i^1 -> -σ_i^1 and σ_i^2 -> -σ_i^2 for all i in a given cluster, keeping τ_i fixed.
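As an illustration of the Metropolis cluster flip (not from the slides), the following C sketch computes the change in H_pair when all spins of a given cluster are flipped in both replicas and accepts with the Metropolis probability; the arrays s1[], s2[], the couplings Jr[], Jd[], the cluster label array, and the 2D L x L periodic lattice are all illustrative assumptions.

#include <math.h>
#include <stdlib.h>

#define L 32
#define N (L*L)

static int s1[N], s2[N];          /* spins of the two replicas */
static int label[N];              /* cluster label of each site, built from tau_i = s1[i]*s2[i] */
static double Jr[N], Jd[N];       /* random couplings to the right and down neighbors */

/* Attempt to flip cluster c in both replicas at inverse temperatures b1, b2. */
void try_cluster_flip(int c, double b1, double b2)
{
    double dH = 0.0;
    int x, y, i;

    /* Boundary bonds of cluster c have exactly one end inside the cluster.
       Flipping the cluster changes the sign of these bond energies, so
       dH_pair = 2 * sum_boundary J_ij (b1 s1_i s1_j + b2 s2_i s2_j). */
    for (y = 0; y < L; ++y)
        for (x = 0; x < L; ++x) {
            int site = x + y*L;
            int r = ((x+1)%L) + y*L;          /* right neighbor */
            int d = x + ((y+1)%L)*L;          /* down neighbor  */
            if ((label[site] == c) != (label[r] == c))
                dH += 2.0 * Jr[site] * (b1*s1[site]*s1[r] + b2*s2[site]*s2[r]);
            if ((label[site] == c) != (label[d] == c))
                dH += 2.0 * Jd[site] * (b1*s1[site]*s1[d] + b2*s2[site]*s2[d]);
        }

    if (dH <= 0.0 || drand48() < exp(-dH))    /* Metropolis acceptance */
        for (i = 0; i < N; ++i)
            if (label[i] == c) { s1[i] = -s1[i]; s2[i] = -s2[i]; }
}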

45 Comparing Correlation Times [Figure legend: Replica MC vs single-spin flip.] Correlation times as a function of inverse temperature β for the 2D ±J Ising spin glass on a 32x32 lattice. From R. H. Swendsen and J.-S. Wang, Phys Rev Lett 57 (1986).

46 2D Spin Glass Susceptibility 2D ±J spin glass susceptibility on a 128x128 lattice, 1.8x10^4 MC steps. From J. S. Wang and R. H. Swendsen, PRB 38 (1988). χ ∝ K^5.11 was concluded.

47 Heat Capacity at Low T The heat capacity behaves as c ∝ T^(-2) exp(-2J/T) [figure shows slope = -2]. This result was confirmed recently by Lukic et al., PRL 92 (2004).

48 Monte Carlo Renormalization Group [Figure: the exponent Y_H (defining equation not reproduced) versus RG iterations for different sizes in 2D.] From J. S. Wang and R. H. Swendsen, PRB 37 (1988) 7745.

49 MCRG in 3D 3D results for Y_H. MCS is 10^4 to 10^5, with 23 samples for L = 8, 8 samples for L = 12, and 5 samples for L = 16.

50 Conclusion Monte Carlo methods have broad applications. Cluster algorithms eliminate the difficulty of critical slowing down. Replica Monte Carlo works on frustrated and disordered systems.