Introduction to Quantum Monte Carlo Methods

Introduction to Quantum Monte Carlo Methods
Claudio Attaccalite
http://attaccalite.com

Outline
- A bit of probability theory
- Variational Monte Carlo
- Wave-function and optimization

Definition of probability

In the limit of a large number of experiments,

$$P(E_i) = p_i = \frac{\text{number of successful events}}{\text{total number of experiments}}, \qquad \sum_{i=1}^{N} p_i = 1.$$

- Joint probability $p_{i,j}$: probability of the composite event in which both $i$ and $j$ occur.
- Marginal probability $p_i = \sum_k p_{i,k}$: probability of $i$, whatever the second event may be.
- Conditional probability $p(j|i)$: probability that $j$ occurs given that event $i$ occurred.

More definitions

Mean value: $\langle x \rangle = \sum_i x_i p_i$. The mean value $\langle x \rangle$ is the expected average after repeating the same experiment many times.

Variance: $\mathrm{var}(x) = \langle x^2 \rangle - \langle x \rangle^2 = \sum_i (x_i - \langle x \rangle)^2 p_i$. The variance is a non-negative quantity that vanishes only if all events with non-vanishing probability give the same value of $x_i$.

Standard deviation: $\sigma = \sqrt{\mathrm{var}(x)}$, used as a measure of the dispersion of the variable $x$.

Chebyshev's inequality

$$P\!\left[(x - \langle x \rangle)^2 \ge \frac{\mathrm{var}(x)}{\delta}\right] \le \delta, \qquad \delta \le 1.$$

If the variance is small, the random variable $x$ becomes "more" predictable: at each event its value $x_i$ is close to $\langle x \rangle$, deviating significantly only with negligible probability.

Extension to Continuous Variables

Cumulative probability: $F(y) = P(x \le y)$. Clearly $F(y)$ is a monotonically increasing function and $F(\infty) = 1$.

Probability density: $\rho(y) = \frac{dF(y)}{dy}$. Obviously $\rho(y) \ge 0$.

For discrete distributions: $\rho(y) = \sum_i p_i\, \delta\!\left(y - x(E_i)\right)$.

The Law of Large Numbers and the Central Limit Theorem

The average of $x$ is obtained by averaging over a large number $N$ of independent realizations of the same experiment:

$$\bar{x} = \frac{1}{N} \sum_i x_i, \qquad \langle \bar{x} \rangle = \langle x \rangle, \qquad \mathrm{var}(\bar{x}) = \langle \bar{x}^2 \rangle - \langle \bar{x} \rangle^2 = \frac{1}{N}\,\mathrm{var}(x).$$

Central Limit Theorem: for large $N$ the average is Gaussian distributed,

$$\rho(\bar{x}) = \frac{1}{\sqrt{2\pi \sigma^2 / N}}\, e^{-\frac{(\bar{x} - \langle x \rangle)^2}{2 \sigma^2 / N}},$$

and its standard deviation decreases as $1/\sqrt{N}$.
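
A quick numerical illustration of the $1/\sqrt{N}$ law (a minimal sketch in Python with NumPy, not part of the original slides):

```python
import numpy as np

rng = np.random.default_rng(0)

# Estimate the mean of a uniform variable on [0, 1] (true mean 0.5)
# for increasing sample sizes; the error bar shrinks as 1/sqrt(N).
for n in (100, 10_000, 1_000_000):
    samples = rng.random(n)
    mean = samples.mean()
    # Standard error of the mean: sigma / sqrt(N)
    err = samples.std(ddof=1) / np.sqrt(n)
    print(f"N = {n:>9}: mean = {mean:.5f} +/- {err:.5f}")
```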

Monte Carlo Example: Estimating π. If you are a very poor dart player, it is easy to imagine throwing darts randomly at the figure on the slide (a circle of radius r inscribed in a square of side 2r): of the total number of darts that hit within the square, the number that hit the shaded circle is proportional to its area.

In other words:

$$P(\text{inside}) = \frac{\pi r^2}{4 r^2} = \frac{\pi}{4},$$

and therefore $\pi \approx 4\, N_{\text{inside}} / N_{\text{total}}$.
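
A minimal sketch of the dart-throwing estimate (Python with NumPy; the variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 1_000_000
# Throw "darts" uniformly into the unit square [0,1] x [0,1].
x, y = rng.random(n), rng.random(n)
# The fraction landing inside the quarter circle of radius 1 estimates pi/4.
inside = (x**2 + y**2 <= 1.0).mean()
pi_est = 4.0 * inside
# Binomial error bar on the hit fraction, propagated to pi.
err = 4.0 * np.sqrt(inside * (1.0 - inside) / n)
print(f"pi ~ {pi_est:.4f} +/- {err:.4f}")
```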

A Simple Integral

Consider the simple integral

$$I = \int_a^b f(x)\, dx.$$

This can be evaluated in the same way as the π example: randomly toss darts in the interval $[a,b]$, evaluate the function $f(x)$ at these points, and average, so that $I \approx (b-a)\,\overline{f}$.
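
A sketch of this estimator for a generic integrand (Python with NumPy assumed; `mc_integrate` is an illustrative helper name):

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_integrate(f, a, b, n=100_000):
    """Estimate int_a^b f(x) dx by averaging f at uniform random points."""
    x = rng.uniform(a, b, n)
    fx = f(x)
    integral = (b - a) * fx.mean()
    # Error bar from the Central Limit Theorem: sigma_f * (b-a) / sqrt(n)
    error = (b - a) * fx.std(ddof=1) / np.sqrt(n)
    return integral, error

# Example: int_0^pi sin(x) dx = 2
val, err = mc_integrate(np.sin, 0.0, np.pi)
print(f"integral ~ {val:.4f} +/- {err:.4f}")
```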

The electronic structure problem. P.A.M. Dirac: "The fundamental laws necessary for the mathematical treatment of a large part of physics and the whole of chemistry are thus completely known, and the difficulty lies only in the fact that application of these laws leads to equations that are too complex to be solved."

Variational Monte Carlo. Monte Carlo integration is necessary because the wave-function contains explicit particle correlations, which lead to non-factorizable multi-dimensional integrals.

How to sample a given probability distribution?

Solution: a Markov chain, i.e. a random walk in configuration space. A Markov chain is a stochastic dynamics in which a random variable $x_n$ evolves according to

$$x_{n+1} = F(x_n, \xi_n).$$

Since $x_n$ and $x_{n+1}$ are not independent, we can define a joint probability to go from the first to the second:

$$f_n(x_{n+1}, x_n) = K(x_{n+1} \mid x_n)\, \rho_n(x_n),$$

where $\rho_n(x_n)$ is the marginal probability to be in $x_n$ and $K(x_{n+1} \mid x_n)$ is the conditional probability to go from $x_n$ to $x_{n+1}$. Summing over $x_n$ gives the master equation:

$$\rho_{n+1}(x_{n+1}) = \sum_{x_n} K(x_{n+1} \mid x_n)\, \rho_n(x_n).$$
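
A toy illustration of the master equation (a hypothetical 3-state chain in Python; iterating $\rho_{n+1} = K \rho_n$ reaches the same limiting distribution from any starting point):

```python
import numpy as np

# Hypothetical 3-state chain: K[j, i] = K(x_j | x_i); each column sums
# to 1 so that total probability is conserved at every step.
K = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.1],
              [0.2, 0.2, 0.6]])

# Iterate the master equation rho_{n+1} = K rho_n from an arbitrary start.
rho = np.array([1.0, 0.0, 0.0])
for _ in range(100):
    rho = K @ rho
print("limiting distribution:", rho)  # the same for any starting rho
```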

Limit distribution of the master equation

$$\rho_{n+1}(x_{n+1}) = \sum_{x_n} K(x_{n+1} \mid x_n)\, \rho_n(x_n)$$

1) Does a limiting distribution $\bar{\rho}(x)$ exist?
2) Starting from a given arbitrary configuration, under which conditions do we converge to it?

Sufficient and necessary conditions for convergence. The answer to the first question requires that $\bar{\rho}$ be stationary under the master equation:

$$\bar{\rho}(x_{n+1}) = \sum_{x_n} K(x_{n+1} \mid x_n)\, \bar{\rho}(x_n).$$

To satisfy this requirement it is sufficient, but not necessary, that the so-called detailed balance condition holds:

$$K(x' \mid x)\, \bar{\rho}(x) = K(x \mid x')\, \bar{\rho}(x').$$

The answer to the second question requires ergodicity, namely that every configuration $x'$ can be reached from any initial configuration $x$ in a sufficiently large number of Markov iterations.

Nicholas Metropolis (1915-1999). The algorithm by Metropolis (and A. Rosenbluth, M. Rosenbluth, A. Teller and E. Teller, 1953) has been cited as among the top 10 algorithms having the "greatest influence on the development and practice of science and engineering in the 20th century." "The code that was to become the famous Monte Carlo method of calculation originated from a synthesis of insights that Metropolis brought to more general applications in collaboration with Stanislaw Ulam in 1949. A team headed by Metropolis, which included Anthony Turkevich from Chicago, carried out the first actual Monte Carlo calculations on the ENIAC in 1948. Metropolis attributes the germ of this statistical method to Enrico Fermi, who had used such ideas some 15 years earlier. The Monte Carlo method, with its seemingly limitless potential for development to all areas of science and economic activities, continues to find new applications." From Physics Today, Oct 2000, Vol. 53, No. 10; see also http://www.aip.org/pt/vol-53/iss-10/p100.html.

The Metropolis Algorithm. We want: 1) a Markov chain that, for large $n$, converges to $\bar{\rho}(x)$; 2) a conditional probability $K(x' \mid x)$ that satisfies detailed balance with this probability distribution. Solution! (by Metropolis and friends): split $K$ into a proposal and an acceptance step,

$$K(x' \mid x) = A(x' \mid x)\, T(x' \mid x), \qquad A(x' \mid x) = \min\!\left[1,\ \frac{\bar{\rho}(x')\, T(x \mid x')}{\bar{\rho}(x)\, T(x' \mid x)}\right],$$

where $T(x' \mid x)$ is a general and reasonable transition probability from $x$ to $x'$.

The Algorithm (see the sketch below)

- start from a random configuration $x$
- generate a new one $x'$ according to $T(x' \mid x)$
- accept or reject according to the Metropolis rule
- evaluate our function and iterate

Important: it is not necessary to have a normalized probability distribution (or wave-function!), since only ratios of $\bar{\rho}$ enter the acceptance.
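
A minimal sketch of the algorithm in Python for a symmetric proposal $T$ (so the $T$ factors cancel in the acceptance ratio), with an illustrative unnormalized Gaussian target:

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis(rho, x0, step, n_samples):
    """Sample an UNNORMALIZED density rho with a symmetric proposal."""
    x, samples = x0, []
    for _ in range(n_samples):
        x_new = x + rng.uniform(-step, step)   # symmetric T(x'|x)
        # Metropolis acceptance: A = min(1, rho(x') / rho(x))
        if rng.random() < rho(x_new) / rho(x):
            x = x_new                          # accept the move
        samples.append(x)                      # keep x, old or new
    return np.array(samples)

# Example: sample exp(-x^2/2) -- no normalization constant needed.
samples = metropolis(lambda x: np.exp(-0.5 * x**2), 0.0, 1.0, 100_000)
print("mean ~", samples.mean(), " var ~", samples.var())  # ~0 and ~1
```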

More or less we have arrived: we can evaluate the integral

$$\langle A \rangle = \frac{\int \Psi^*(R)\, \hat{A}\, \Psi(R)\, dR}{\int |\Psi(R)|^2\, dR} = \frac{\int A_L(R)\, \Psi^2(R)\, dR}{\int \Psi^2(R)\, dR}, \qquad A_L(R) = \frac{\hat{A}\Psi(R)}{\Psi(R)},$$

and its variance

$$\mathrm{var}(A) = \frac{\int A_L^2(R)\, \Psi^2(R)\, dR}{\int \Psi^2(R)\, dR} - \langle A \rangle^2,$$

by sampling $\Psi^2(R)$ with the Metropolis algorithm. But we still need a wave-function...

The trial wave-function. The trial function completely determines the quality of the approximation for the physical observables. The simplest WF is the Slater-Jastrow form:

$$\Psi(r_1, r_2, \ldots, r_n) = D[\varphi]\, \exp(U_{\mathrm{corr}}),$$

where the Slater determinant $D[\varphi]$ can be taken from DFT, CI, HF, or built from scratch, and $\exp(U_{\mathrm{corr}})$ is the Jastrow correlation factor. Other functional forms: BCS pairing, multi-determinant, Pfaffian.

Optimization strategies. In order to obtain a good variational wave-function, one can optimize the WF by minimizing one of the following functionals, or a linear combination of both.

The variational energy:

$$E_V(a,b,c,\ldots) = \frac{\int \Psi^*_{a,b,c,\ldots}\, H\, \Psi_{a,b,c,\ldots}\, dR}{\int |\Psi_{a,b,c,\ldots}|^2\, dR}$$

The variance of the energy (always non-negative, and zero for the exact ground state!):

$$\sigma^2(a,b,c,\ldots) = \frac{\int \left(\frac{H\Psi}{\Psi}\right)^2 \Psi^2\, dR}{\int \Psi^2\, dR} - E_V^2$$
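
A toy VMC run putting these pieces together, under simplifying assumptions not in the slides: a 1D harmonic oscillator $H = -\tfrac{1}{2}\partial_x^2 + \tfrac{1}{2}x^2$ with trial $\Psi_\alpha(x) = e^{-\alpha x^2}$, whose local energy works out to $E_L(x) = \alpha + x^2(\tfrac{1}{2} - 2\alpha^2)$. Scanning $\alpha$ shows both functionals minimized at the exact ground state $\alpha = 1/2$:

```python
import numpy as np

rng = np.random.default_rng(0)

def local_energy(x, alpha):
    # E_L = (H psi)/psi for psi = exp(-alpha x^2),
    # H = -1/2 d^2/dx^2 + x^2/2 (harmonic oscillator)
    return alpha + x**2 * (0.5 - 2.0 * alpha**2)

def vmc_energy(alpha, n_steps=100_000, n_equil=10_000, step=1.0):
    """Sample psi^2 with Metropolis, then average the local energy."""
    x, e_loc = 0.0, []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        # acceptance ratio |psi(x')/psi(x)|^2 -- no normalization needed
        if rng.random() < np.exp(-2.0 * alpha * (x_new**2 - x**2)):
            x = x_new
        e_loc.append(local_energy(x, alpha))
    e = np.array(e_loc[n_equil:])  # discard equilibration steps
    return e.mean(), e.var()

for alpha in (0.3, 0.5, 0.7):
    E, var = vmc_energy(alpha)
    print(f"alpha = {alpha}: E_V ~ {E:.4f}, sigma^2 ~ {var:.4f}")
# alpha = 0.5 is the exact ground state: E_V = 0.5 with zero variance.
```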

And finally an application!!!

2D electron gas. The Hamiltonian, in units of the Wigner-Seitz radius $r_s = 1/(\sqrt{\pi n}\, a_B)$:

$$H = -\frac{1}{2 r_s^2} \sum_i^N \nabla_i^2 + \frac{1}{r_s} \sum_{i<j}^N \frac{1}{|r_i - r_j|}$$

[Figure: unpolarized phase vs. Wigner crystal]

2D electron gas: the phase diagram. We found a new phase of the 2D electron gas at low density: a stable spin-polarized phase that appears before Wigner crystallization.

Difficulties with VMC

- The many-electron wave-function is unknown
- It has to be approximated
- It may seem hopeless to have to actually guess the wave-function
- But it is surprisingly accurate when it works

The limitation of VMC

- Nothing can really be done if the trial wave-function is not accurate enough
- Moreover, it favours simple states over more complicated ones
- Therefore, there are other methods, for example diffusion QMC

Next Monday

- Diffusion Monte Carlo and the sign problem
- Applications

Then...

- Finite-temperature path-integral Monte Carlo
- One-dimensional electron gas
- Excited states
- One-body density matrix
- Diagrammatic Monte Carlo

References

- S. Sorella, G. E. Santoro and F. Becca, SISSA Lectures on Numerical Methods for Strongly Correlated Electrons, 4th draft (2008)
- I. Kosztin, B. Faber and K. Schulten, Introduction to the Diffusion Monte Carlo Method, arXiv:physics/9702023
- FreeScience.info, Quantum Monte Carlo: http://www.freescience.info/books.php?id=35

Exact conditions: cusp conditions. Electron-nucleus cusp condition: when one electron approaches a nucleus of charge $Z$, the wave-function reduces to a simple hydrogen-like form, namely

$$\left.\frac{1}{\Psi}\frac{\partial \Psi}{\partial r}\right|_{r \to 0} = -Z,$$

i.e. $\Psi \sim e^{-Zr}$ at short distance. An analogous condition holds when two electrons meet (the electron-electron cusp condition), and it can be satisfied with a two-body Jastrow factor.
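
A small numeric check of the cusp idea (not from the slides): for a hydrogen-like atom with $\Psi = e^{-Zr}$, the $Z/r$ term coming from the kinetic energy exactly cancels the $-Z/r$ Coulomb divergence, leaving a constant local energy $-Z^2/2$ everywhere:

```python
Z = 1.0  # hydrogen-like atom, atomic units

def local_energy(r):
    # For psi = exp(-Z r), the radial Laplacian gives
    # -1/2 lap(psi)/psi = Z/r - Z^2/2; the Coulomb term -Z/r then
    # cancels the 1/r divergence exactly (the cusp condition).
    kinetic = Z / r - 0.5 * Z**2
    potential = -Z / r
    return kinetic + potential

for r in (1e-6, 0.1, 1.0, 10.0):
    print(f"r = {r:8.1e}: E_L = {local_energy(r):+.6f}")  # always -Z^2/2
```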