F.F. Assaad. MPI Stuttgart / Universität Stuttgart. Numerical approaches to the correlated electron problem: Quantum Monte Carlo. Outline: the Monte Carlo method, basics; spin systems: world-lines, loops and stochastic series expansions; the auxiliary field method I; the auxiliary field method II: ground state, finite temperature and Hirsch-Fye; special topics (Kondo / metal-insulator transition) and outlook.
Some generalities. Problem: ~10^23 electrons per cm^3. Question: ground state and elementary excitations. Without correlations (Fermi statistics alone): the ground state is the Fermi sea, the elementary excitations are particle-hole pairs, and the CPU time scales as N^3. With correlations (Coulomb): the low-energy elementary excitations may remain particles and holes (Fermi-liquid theory: screening, phase space), but correlations also produce the Luttinger liquid in 1D (spinon, holon), the fractional quantum Hall effect in 2D, magnetism, Mott insulators, metal-insulator transitions and heavy fermions. The complexity of the full problem scales as e^N.
Starting point: a lattice Hamiltonian H and the trace over Fock space, rewritten as a path integral (the rewriting is not unique). When the resulting weights are not positive one faces the sign problem; approximate strategies then include CPQMC and PIRG. World-line approach with loop updates and stochastic series expansion: O(N) methods, free of the sign problem for non-frustrated spin systems, bosonic systems and the 1D Hubbard and t-J models, but with a sign problem already for non-interacting electrons in dimensions larger than one. Determinantal method: an O(N^3) method built on any mean-field Hamiltonian; no sign problem for models with particle-hole symmetry (half-filled Hubbard model, Kondo lattices), models with attractive interactions (attractive Hubbard model, Holstein model) and impurity problems.
The Monte Carlo Method. Basic ideas. Aim: compute a d-dimensional integral ∫_V d^d x f(x). Split the domain into hyper-cubes of linear size h and use an integration method whose systematic error scales as h^k. In terms of the number of function evaluations N = V/h^d, the systematic error is then proportional to N^(-k/d). Thus grid methods give poor results for large values of d, and the Monte Carlo method, whose error scales as 1/√N independently of d, becomes attractive.
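The scaling argument can be illustrated with a minimal sketch (the function `mc_integrate` and the parameter choices are my own, not from the lecture): the statistical error of the estimate is ~ 1/√N regardless of the dimension d.

```python
import random

def mc_integrate(f, d, n, seed=0):
    """Monte Carlo estimate of the integral of f over the unit
    hyper-cube [0,1]^d: average f at n uniformly drawn points."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        acc += f([rng.random() for _ in range(d)])
    return acc / n

# integral of sum_i x_i over [0,1]^10 is exactly 10/2 = 5
d = 10
est = mc_integrate(lambda x: sum(x), d, 100_000)
print(est)    # close to 5.0; statistical error ~ 1/sqrt(N), independent of d
```

A grid method with the same 10^5 function evaluations would allow only about 3 points per axis in d = 10, so the Monte Carlo estimate wins easily here.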
The central limit theorem. Let x_1, ..., x_N be a set of statistically independent points distributed according to the probability distribution P(x). Then we can estimate ∫ dx P(x) f(x) ≈ X = (1/N) Σ_i f(x_i). What is the error? For large N the distribution of X becomes Gaussian with width σ/√N, where σ^2 is the variance of f. For practical purposes we estimate σ^2 ≈ (1/N) Σ_i f(x_i)^2 - X^2. Thus the error (i.e. the width of the Gaussian distribution) scales as 1/√N irrespective of the dimensionality of the integration space. Demonstration of the theorem.
An example: calculation of π. The quarter disc x^2 + y^2 ≤ 1 inside the unit square has area π/4, so in this case f is the indicator function of the quarter disc. Draw N random points (x, y), with x and y drawn from the uniform distribution on the interval [0,1]; the fraction of points falling inside the quarter disc estimates π/4. Take N = 8000 to obtain one estimate X, and repeat this simulation many times to compute the distribution D(X).
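A minimal sketch of this experiment (the name `estimate_pi` and the seeding are assumptions, not from the slides):

```python
import random

def estimate_pi(n, seed=1):
    """Fraction of uniform points (x, y) in [0,1]^2 with
    x^2 + y^2 <= 1, times 4, estimates pi."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return 4.0 * hits / n

print(estimate_pi(8000))   # ~ 3.14, with a statistical spread of ~ 0.02
```

Calling the function repeatedly with different seeds produces the distribution D(X) discussed above; its width shrinks as 1/√N.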
Markov chains: generating points according to a distribution P(x). Define a Monte Carlo time-dependent probability distribution P_t(x) which evolves according to a Markov process: the future depends only on the present. The time evolution is given by P_{t+1}(y) = Σ_x T(y ← x) P_t(x), where T is the transition matrix. Requirements on T: (1) T(y ← x) ≥ 0 and Σ_y T(y ← x) = 1; (2) ergodicity; (3) the stationarity condition Σ_x T(y ← x) P(x) = P(y). The stationarity condition is fulfilled if the detailed balance condition T(y ← x) P(x) = T(x ← y) P(y) is satisfied. But the stationarity condition is essential, not detailed balance!
Convergence to P(x) and rate of convergence. The eigenvalues λ_i of T satisfy |λ_i| ≤ 1, and λ_0 = 1 corresponds to the stationary distribution P(x). The rate of convergence depends on the second largest eigenvalue λ_1: expanding P_{t=0} in the eigenvectors of T, the deviation from the stationary distribution decays as |λ_1|^t.
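The role of the second eigenvalue can be seen in a toy two-state chain (the flip probabilities a, b are illustrative choices, not from the lecture):

```python
# Two-state toy chain: column-stochastic T, p_{t+1} = T p_t.
# a = probability of flipping 0 -> 1, b = probability of flipping 1 -> 0.
a, b = 0.3, 0.1
T = [[1.0 - a, b],
     [a, 1.0 - b]]
pi = [b / (a + b), a / (a + b)]     # stationary distribution (lambda_0 = 1)

# second eigenvalue lambda_1 = 1 - a - b = 0.6 sets the convergence rate
p = [1.0, 0.0]                      # start entirely in state 0
for _ in range(30):
    p = [T[0][0] * p[0] + T[0][1] * p[1],
         T[1][0] * p[0] + T[1][1] * p[1]]

print(p, pi)   # deviation from pi has shrunk by |lambda_1|^30 ~ 2e-7
```

Doubling a and b to 0.6 and 0.2 leaves the stationary distribution unchanged but halves the distance to it per step, since λ_1 drops from 0.6 to 0.2.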
Explicit construction of T. Write T(y ← x) = T_0(y ← x) a(y ← x), where T_0(y ← x) is the probability of proposing a move from x to y and has to satisfy the ergodicity condition (2) as well as (1), and a(y ← x) is the probability of accepting the move. Since 0 ≤ a ≤ 1 and rejected moves leave the configuration at x, T satisfies (1). To satisfy the stationarity condition (3) we require detailed balance. Ansatz: Metropolis, a(y ← x) = min(1, T_0(x ← y) P(y) / [T_0(y ← x) P(x)]); heat bath, a(y ← x) = T_0(x ← y) P(y) / [T_0(x ← y) P(y) + T_0(y ← x) P(x)].
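For a symmetric proposal T_0 and Boltzmann weights P ∝ e^(-βE), the ratio P(y)/P(x) reduces to e^(-βΔE) and the two acceptance rules can be sketched as follows (function names are my own):

```python
import math
import random

def metropolis_accept(dE, beta, rng):
    """Metropolis: accept with probability min(1, e^{-beta*dE})."""
    if dE <= 0.0:
        return True                      # downhill moves always accepted
    return rng.random() < math.exp(-beta * dE)

def heatbath_accept(dE, beta, rng):
    """Heat bath: accept with probability 1 / (1 + e^{beta*dE}),
    i.e. P(y) / (P(x) + P(y)); evaluated in a numerically stable way."""
    x = -beta * dE
    if x >= 0.0:
        return rng.random() < 1.0 / (1.0 + math.exp(-x))
    p = math.exp(x)
    return rng.random() < p / (1.0 + p)
```

Both choices satisfy detailed balance; Metropolis accepts more often (its acceptance probability is always at least as large as the heat-bath one) and is therefore the usual default.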
Ergodicity. To achieve ergodicity, one will often want to combine different types of moves. Let T^(i), i = 1...N, each satisfy conditions (1) and (3). We can combine those moves randomly, T_R = (1/N) Σ_i T^(i), or sequentially, T_S = T^(N) ... T^(2) T^(1), to achieve ergodicity. Note: if each T^(i) satisfies the detailed balance condition, then T_R also satisfies detailed balance, but T_S satisfies only the stationarity condition.
Autocorrelation time and error analysis: binning analysis. Monte Carlo simulation: 1) start with a configuration x_0; 2) propose a move from x_0 to y according to T_0(y ← x_0) and accept it with probability a(y ← x_0); 3) if the move is accepted, set x_0 = y and carry out measurements; 4) go to 2). This generates a sequence x_0, x_1, ..., x_N which, if N is large enough, will be distributed according to P(x). The relevant time scale for forgetting the memory of the initial configuration is the autocorrelation time τ.
To use the central limit theorem to evaluate the error, we need statistically independent measurements. Binning: group the raw data into bins of size n·τ, where τ is the autocorrelation time, and estimate the error from the variance of the bin averages. If n is large enough (n ~ 5-10), the error estimate will be independent of n.
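A sketch of the binning estimate (the helper name `binned_error` is my own; the demo uses statistically independent uniform data, for which the estimate is already flat in the bin size):

```python
import math
import random

def binned_error(data, bin_size):
    """Group raw measurements into bins of length bin_size, then
    estimate the error of the mean from the spread of bin averages.
    Once the bins are longer than the autocorrelation time, this
    converges to the true error of the mean."""
    n_bins = len(data) // bin_size
    means = [sum(data[i * bin_size:(i + 1) * bin_size]) / bin_size
             for i in range(n_bins)]
    avg = sum(means) / n_bins
    var = sum((m - avg) ** 2 for m in means) / (n_bins - 1)
    return math.sqrt(var / n_bins)

# independent data: the estimate barely changes with bin size
rng = random.Random(2)
data = [rng.random() for _ in range(10000)]
print(binned_error(data, 1), binned_error(data, 50))  # both ~ 0.0029
```

For correlated Monte Carlo data the estimate instead grows with the bin size until the bins exceed τ, and one reads off the error from the plateau.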
Example: the one-dimensional Ising model, H = -J Σ_i s_i s_{i+1} with s_i = ±1. We want to compute spin-spin correlation functions g(r) = <s_i s_{i+r}>. Algorithm: choose a site randomly, propose a spin flip, accept with the Metropolis or heat-bath probability, and carry out the measurement, e.g., after each sweep.
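A compact sketch of this algorithm for a periodic chain, compared against the exact transfer-matrix result g(r) = (t^r + t^(L-r)) / (1 + t^L) with t = tanh(βJ) (the parameter values below are illustrative, not the lecture's):

```python
import math
import random

def ising_sweep(spins, beta, J, rng):
    """One Metropolis sweep over a periodic 1D Ising chain with
    H = -J * sum_i s_i s_{i+1}."""
    L = len(spins)
    for _ in range(L):
        i = rng.randrange(L)                          # choose a site randomly
        dE = 2.0 * J * spins[i] * (spins[i - 1] + spins[(i + 1) % L])
        if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
            spins[i] = -spins[i]                      # accept the spin flip

L, beta, J, sweeps = 24, 1.0, 1.0, 20000
rng = random.Random(3)
spins = [1] * L
g = 0.0
for _ in range(sweeps):
    ising_sweep(spins, beta, J, rng)
    g += spins[0] * spins[L // 2]                     # measure after each sweep
g /= sweeps

t = math.tanh(beta * J)
exact = (t ** (L // 2) + t ** (L - L // 2)) / (1.0 + t ** L)
print(g, exact)    # MC value fluctuates around the exact result
```

Applying the binning analysis of the previous section to the sequence of per-sweep measurements gives the error bar on g.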
Example of error analysis for the L = 24 1D Ising model. Errors are quoted in two units of Monte Carlo time: (a) a single sweep and (b) the autocorrelation time as determined from (a). [Table comparing g(L/2) exact with g(L/2) from MC for several values of J; the numerical entries were lost in extraction.] Results obtained after 2×10^6 sweeps.
Random number generators. Linear congruential: I_{n+1} = (a·I_n + c) mod m; period up to 2^31 for 32-bit integers. Deterministic (i.e. pseudo-random): for a given initial value of I the sequence of random numbers is reproducible. Quality checks: (1) distribution: the X_n should be distributed according to P(x). (Ref: Numerical Recipes, Cambridge University Press.)
(2) Correlations: successive numbers X_n and X_{n+k} should be uncorrelated. (3) 2-tuples: pairs (X_n, X_{n+1}) plotted in the unit square should fill it uniformly, without lattice structure.
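A toy linear congruential generator together with check (1) can be sketched as follows (the constants are the widespread ANSI-C choices, used here purely as an illustration of the scheme, not as a recommendation):

```python
class LCG:
    """Linear congruential generator I_{n+1} = (a*I_n + c) mod m."""
    def __init__(self, seed=1):
        self.m = 2 ** 31
        self.a = 1103515245       # common ANSI-C constants (illustrative)
        self.c = 12345
        self.i = seed
    def next(self):
        self.i = (self.a * self.i + self.c) % self.m
        return self.i / self.m    # uniform in [0, 1)

# quality check (1): the k-th moment of a uniform variate is 1/(k+1)
rng = LCG(seed=42)
n = 100_000
samples = [rng.next() for _ in range(n)]
for k in (1, 2, 3):
    print(k, sum(s ** k for s in samples) / n)   # ~ 0.5, 0.333, 0.25
```

Checks (2) and (3) follow the same pattern: accumulate averages of X_n·X_{n+k}, or scatter-plot pairs (X_n, X_{n+1}) and look for lattice structure.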
The generation of good pseudo-random numbers is a delicate issue which requires some care and extensive quality checks. It is therefore highly recommended not to invent one's own secret recursion rules, but to use one of the well-known generators which have been tested by many other workers in the field.