Computational approaches for quantum many-body systems

HGSFP Graduate Days SS2019
Martin Gärttner

Course overview

Lecture 1: Introduction to many-body spin systems
  Quantum Ising model, Bloch sphere, tensor structure, exact diagonalization
Lecture 2: Collective spin models
  LMG model, symmetry, semi-classical methods, Monte Carlo
Lecture 3: Entanglement
  Mixed states, partial trace, Schmidt decomposition
Lecture 4: Tensor network states
  Area laws, matrix product states, tensor contraction, AKLT model
Lecture 5: DMRG and other variational approaches
  Energy minimization, PEPS and MERA, neural quantum states

Learning goals

After today you will be able to …
… explain how to find ground states using the MPS ansatz (DMRG).
… dig deeper into tensor network states (PEPS and MERA).
… explain alternative variational ansätze inspired by neural networks.

Tensor network states beyond MPS: Extensions and applications

Projected entangled pair states (PEPS): the extension of MPS to two-dimensional lattices.
Problem: unlike MPS, PEPS admit no efficient exact contraction, so expectation values must be computed with approximate contraction schemes.

Tensor network states beyond MPS: Extensions and applications

Multiscale entanglement renormalization ansatz (MERA): designed to treat quantum critical ground states.
Its layered structure realizes a discrete analogue of the AdS(2+1)/CFT(1+1) correspondence.

Libraries for tensor network states

iTensor: C++ library for tensor network state calculations. http://itensor.org/
ALPS (Algorithms and Libraries for Physics Simulations): many different numerical methods for quantum many-body systems, not only spins, and in particular quantum Monte Carlo methods. MPS documentation: http://alps.comp-phys.org/static/mps_doc/
Open Source MPS (OSMPS), with a Python frontend: https://openmps.sourceforge.io/

Other variational approaches

Variational Monte Carlo (VMC):

$$E(a) = \frac{\langle\psi(a)|H|\psi(a)\rangle}{\langle\psi(a)|\psi(a)\rangle}
       = \frac{\sum_{\sigma,\sigma'} \psi_{\sigma'}^*(a)\, H_{\sigma'\sigma}\, \psi_\sigma(a)}{\sum_\sigma |\psi_\sigma(a)|^2}
       = \sum_\sigma \frac{|\psi_\sigma(a)|^2}{\sum_{\sigma'} |\psi_{\sigma'}(a)|^2}\; \frac{\sum_{\sigma'} \psi_{\sigma'}^*(a)\, H_{\sigma'\sigma}}{\psi_\sigma^*(a)}$$

Sample states according to
$$P_\sigma = \frac{|\psi_\sigma(a)|^2}{\sum_{\sigma'} |\psi_{\sigma'}(a)|^2}$$

Local energies:
$$E_{\mathrm{loc}}(\sigma) = \frac{\sum_{\sigma'} \psi_{\sigma'}^*(a)\, H_{\sigma'\sigma}}{\psi_\sigma^*(a)}$$
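
As a concrete illustration of these formulas, here is a minimal VMC sketch: Metropolis sampling of $P_\sigma$ and averaging of local energies for the 1D transverse-field Ising model, $H = -J\sum_i \sigma^z_i \sigma^z_{i+1} - h\sum_i \sigma^x_i$, with a deliberately crude one-parameter ansatz $\psi_\sigma(a) = e^{a\sum_i s_i}$. The model, the ansatz, and all names are illustrative assumptions, not part of the lecture.

```python
# Minimal VMC sketch (illustrative toy, not the lecture's code):
# sample P_sigma ∝ |psi_sigma|^2 by Metropolis, average local energies.
import numpy as np

rng = np.random.default_rng(0)
N, J, h = 8, 1.0, 0.5  # chain length, Ising coupling, transverse field

def psi(sigma, a):
    # Toy ansatz psi_sigma(a) = exp(a * sum_i s_i); stand-in for a real ansatz
    return np.exp(a * sigma.sum())

def local_energy(sigma, a):
    # Diagonal term -J sum_i s_i s_{i+1} plus off-diagonal term
    # -h sum_i psi(sigma with spin i flipped) / psi(sigma)
    e = -J * np.sum(sigma[:-1] * sigma[1:])
    for i in range(N):
        flipped = sigma.copy(); flipped[i] *= -1
        e -= h * psi(flipped, a) / psi(sigma, a)
    return e

def vmc_energy(a, n_steps=20000):
    sigma = rng.choice([-1, 1], size=N)
    energies = []
    for step in range(n_steps):
        i = rng.integers(N)                      # propose one spin flip
        prop = sigma.copy(); prop[i] *= -1
        if rng.random() < (psi(prop, a) / psi(sigma, a)) ** 2:
            sigma = prop                         # Metropolis accept
        if step > n_steps // 10:                 # discard burn-in
            energies.append(local_energy(sigma, a))
    return np.mean(energies)

print(vmc_energy(a=0.1))
```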

Neural-network quantum states

[Carleo and Troyer, Science 2017]

Restricted Boltzmann machine (RBM) ansatz. The starting point is again a general quantum state; collecting the indices into a multi-index $\mathbf{v}$,

$$|\psi\rangle = \sum_{i_1 \ldots i_N} c_{i_1 \ldots i_N}\, |i_1 \ldots i_N\rangle = \sum_{\mathbf{v}} c_{\mathbf{v}}\, |\mathbf{v}\rangle .$$

The coefficients are parameterized by a network architecture called a restricted Boltzmann machine: visible and hidden spins (all taking values $-1$ or $1$), with connections only between visible and hidden units (interactions) and biases (energy penalties). These parameters define the network energy

$$E[\mathbf{v},\mathbf{h}] = -\sum_{i,j} W_{ij}\, v_i h_j - \sum_i a_i v_i - \sum_j b_j h_j , \qquad v_i, h_j \in \{-1, 1\} ,$$

and the ansatz sums over the hidden spins:

$$c_{\mathbf{v}} = \sum_{\{\mathbf{h}\}} e^{-E[\mathbf{v},\mathbf{h}]} .$$

In classical networks this construction defines a probability. The difference to usual RBMs is that the coefficients, and hence the weights, are complex, so $c_{\mathbf{v}}$ is not a probability distribution.

See also: Deng, Li, Das Sarma, PRX 2017 and PRB 2017; Gao, Duan, Nat. Commun. 2017; Cirac et al., PRX 2018; Clark, J. Phys. A 2018; Moore, arXiv 2017; Carleo, Nomura, Imada, arXiv 2018; Freitas, Morigi, Dunjko, arXiv 2018; …
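
To make the definition concrete, the following sketch (a toy with arbitrary sizes, seed, and names, assuming nothing beyond the formulas above) assembles the full unnormalized RBM state vector for a tiny system by brute-force enumeration of visible and hidden configurations:

```python
# Brute-force construction of the RBM state |psi> = sum_v c_v |v> for a tiny
# system; only feasible for small n_v, n_h, but it makes the definition explicit.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n_v, n_h = 3, 2
# Complex weights and biases (the key difference from classical RBMs)
W = 0.2 * (rng.standard_normal((n_v, n_h)) + 1j * rng.standard_normal((n_v, n_h)))
a = 0.2 * (rng.standard_normal(n_v) + 1j * rng.standard_normal(n_v))
b = 0.2 * (rng.standard_normal(n_h) + 1j * rng.standard_normal(n_h))

def network_energy(v, h):
    # E[v,h] = -sum_ij W_ij v_i h_j - sum_i a_i v_i - sum_j b_j h_j
    return -(v @ W @ h) - a @ v - b @ h

# c_v = sum over all hidden configurations h of exp(-E[v,h]); entries are ±1
basis = [np.array(v) for v in itertools.product([-1, 1], repeat=n_v)]
coeffs = np.array([
    sum(np.exp(-network_energy(v, np.array(h)))
        for h in itertools.product([-1, 1], repeat=n_h))
    for v in basis
])
psi = coeffs / np.linalg.norm(coeffs)  # normalized amplitudes in the v-basis
for v, c in zip(basis, psi):
    print(v, np.round(c, 3))
```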

Neural-network quantum states

[Figure: RBM network diagram. Visible units $v_1 \ldots v_5$ with biases $a_i$, hidden units $h_1 \ldots h_6$ with biases $b_j$, connected by weights $W_{ij}$.]

Efficient evaluation. The sum over the hidden neurons still has exponentially many terms, but an important feature of RBM states is that this sum can be carried out explicitly: a simple rewriting turns it into a product,

$$c_{\mathbf{v}} = \sum_{\{\mathbf{h}\}} e^{\sum_{i,j} W_{ij} v_i h_j + \sum_i a_i v_i + \sum_j b_j h_j}
             = e^{\sum_i a_i v_i} \prod_j 2\cosh\Big(b_j + \sum_i W_{ij} v_i\Big) .$$

The hidden spins no longer appear explicitly, which is why they are called hidden. This is now simply a variational ansatz for the state, which can be treated using the machinery of variational Monte Carlo methods. A nice feature of this ansatz is that more parameters can be added naturally by simply adding more hidden neurons, so one can hope for convergence with respect to the number of hidden neurons. This addresses the general problem of any representation: it must be efficiently evaluable. Tensor network states, for comparison, must be efficiently contractible, which is only possible in 1D and no longer holds for higher-dimensional tensor networks.
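
The factorization can be checked numerically in a few lines; this sketch (illustrative parameters and names, not part of the lecture) compares the brute-force hidden sum against the closed-form product for random complex weights:

```python
# Verify: the hidden sum equals exp(sum_i a_i v_i) * prod_j 2 cosh(theta_j)
# with theta_j = b_j + sum_i W_ij v_i.
import itertools
import numpy as np

rng = np.random.default_rng(2)
n_v, n_h = 4, 3
W = 0.1 * (rng.standard_normal((n_v, n_h)) + 1j * rng.standard_normal((n_v, n_h)))
a = 0.1 * (rng.standard_normal(n_v) + 1j * rng.standard_normal(n_v))
b = 0.1 * (rng.standard_normal(n_h) + 1j * rng.standard_normal(n_h))

def c_bruteforce(v):
    # Exponentially expensive sum over all 2^n_h hidden configurations
    return sum(np.exp(v @ W @ np.array(h) + a @ v + b @ np.array(h))
               for h in itertools.product([-1, 1], repeat=n_h))

def c_closedform(v):
    # O(n_v * n_h) evaluation: the hidden sum factorizes into cosh terms
    return np.exp(a @ v) * np.prod(2 * np.cosh(b + v @ W))

v = rng.choice([-1, 1], size=n_v)
print(c_bruteforce(v))   # the two results agree
print(c_closedform(v))   # up to floating-point error
```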

Neural-network quantum states

Finding ground states: stochastic reconfiguration method. Minimize the energy functional

$$E(\mathbf{W}) = \frac{\langle\psi|H|\psi\rangle}{\langle\psi|\psi\rangle}
             = \frac{\sum_{\mathbf{v},\mathbf{v}'} c_{\mathbf{v}'}^*\, H_{\mathbf{v}'\mathbf{v}}\, c_{\mathbf{v}}}{\sum_{\mathbf{v}} |c_{\mathbf{v}}|^2}$$

by gradient descent: for each network parameter $W_k$, the update step subtracts a term proportional to the gradient of the energy with respect to that parameter,

$$W_k^{(p+1)} = W_k^{(p)} - \gamma\, \frac{\partial E}{\partial W_k} ,$$

where $\gamma$ is the learning rate (or imaginary time step). At first sight this seems problematic, because the gradients involve sums over all visible configurations. But variational Monte Carlo tells us that these sums can be sampled efficiently by importance sampling according to the squared moduli of the coefficients, using a simple Markov chain Monte Carlo procedure. Determining the gradients by Monte Carlo sampling in this way amounts to imaginary time evolution.
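
A minimal sketch of this optimization loop, reusing the transverse-field Ising toy and one-parameter ansatz from the VMC sketch above. It implements plain gradient descent with Monte Carlo gradient estimates; full stochastic reconfiguration would additionally precondition the gradient with the covariance matrix of the log-derivatives. All of it is an illustrative assumption, not the lecture's implementation.

```python
# Ground-state search by gradient descent with Monte Carlo gradients for the
# toy ansatz ln psi_sigma = a * sum_i s_i (1D transverse-field Ising model).
import numpy as np

rng = np.random.default_rng(3)
N, J, h = 8, 1.0, 0.5

def log_psi(sigma, a):
    return a * sigma.sum()

def local_energy(sigma, a):
    e = -J * np.sum(sigma[:-1] * sigma[1:])
    for i in range(N):
        flipped = sigma.copy(); flipped[i] *= -1
        e -= h * np.exp(log_psi(flipped, a) - log_psi(sigma, a))
    return e

def sample(a, n_steps=5000):
    # Metropolis sampling of P_sigma ∝ |psi_sigma|^2; collect E_loc and
    # the log-derivative O(sigma) = d ln psi / d a = sum_i s_i
    sigma = rng.choice([-1, 1], size=N)
    e_loc, o = [], []
    for step in range(n_steps):
        i = rng.integers(N)
        prop = sigma.copy(); prop[i] *= -1
        if rng.random() < np.exp(2 * (log_psi(prop, a) - log_psi(sigma, a))):
            sigma = prop
        if step > n_steps // 10:  # discard burn-in
            e_loc.append(local_energy(sigma, a))
            o.append(sigma.sum())
    return np.array(e_loc), np.array(o)

a, gamma = 0.3, 0.02
for _ in range(50):
    e_loc, o = sample(a)
    # dE/da = 2 ( <E_loc O> - <E_loc><O> ) for real psi and real parameter
    grad = 2 * (np.mean(e_loc * o) - np.mean(e_loc) * np.mean(o))
    a -= gamma * grad
print(f"a = {a:.3f}, E = {np.mean(e_loc):.3f}")
```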

Neural-network quantum states

Time evolution: time-dependent variational Monte Carlo. The time-dependent variational principle says: minimize the deviation from the solution of the Schrödinger equation in each time step,

$$R\big(t, \dot{W}(t)\big) = \mathrm{dist}\big[\partial_t \psi_W,\, -iH\psi\big] ,$$

which leads to the update

$$W_k(t+\Delta t) = W_k(t) - i\,\Delta t\, \frac{\partial E}{\partial W_k}$$

with time step $\Delta t$. The gradients are again determined by Monte Carlo sampling; with the imaginary unit in the update this is real time evolution.
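
For completeness, a hedged sketch of the step the slide leaves implicit (this is the standard t-VMC result of Carleo and Troyer, Science 2017, stated here under that assumption): minimizing $R$ yields a linear system for the parameter velocities. With log-derivatives $O_k(\sigma) = \partial \ln \psi_\sigma / \partial W_k$ and all expectation values taken as Monte Carlo averages over $P_\sigma$,

$$S_{kk'} = \langle O_k^*\, O_{k'} \rangle - \langle O_k^* \rangle \langle O_{k'} \rangle , \qquad
  F_k = \langle E_{\mathrm{loc}}\, O_k^* \rangle - \langle E_{\mathrm{loc}} \rangle \langle O_k^* \rangle ,$$

$$\sum_{k'} S_{kk'}\, \dot{W}_{k'} = -i\, F_k \ \ \text{(real time)} , \qquad
  \sum_{k'} S_{kk'}\, \dot{W}_{k'} = -F_k \ \ \text{(imaginary time, ground-state search)} .$$

The simple update rules on the last two slides correspond to replacing $S$ by the identity.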

References

Ulrich Schollwöck: The density-matrix renormalization group in the age of matrix product states, Annals of Physics 326, 96 (2011)
Time-dependent variational principle: Phys. Rev. Lett. 107, 070601 (2011)
MERA and AdS/CFT: e.g. Phys. Rev. D 86, 065007 (2012)
Neural-network quantum state ansatz: Giuseppe Carleo and Matthias Troyer: Solving the quantum many-body problem with artificial neural networks, Science 355, 602 (2017)