Combining Tensor Networks with Monte Carlo: Applications to the MERA
Andy Ferris (1,2) and Guifre Vidal (1,3)
(1) University of Queensland, Australia; (2) Université de Sherbrooke, Québec; (3) Perimeter Institute for Theoretical Physics, Ontario

Motivation: make tensor networks faster
Calculations should be efficient in memory and computation (polynomial in χ, etc.). However, the total cost might still be huge (e.g. in 2D). Parameters: d^L for the full state vs. poly(χ, d, L) for the tensor network.

Monte Carlo makes stuff faster
Monte Carlo: random sampling of a sum, and a tensor contraction is just a sum. Variational MC: optimizing the parameters. The price is statistical noise, which is reduced by importance sampling over some positive probability distribution P(s).
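The idea on this slide can be sketched with a toy example (mine, not from the talk): estimate a plain sum by random sampling, and note how an importance-sampling distribution P(s) close to |f(s)| reduces the statistical noise.

```python
import random

random.seed(0)

# Toy sketch: estimate S = sum_k f(k), k = 0..N-1, by sampling k from a
# positive distribution P(k) and averaging f(k)/P(k). The estimator is
# unbiased for any P; choosing P close to |f|/S reduces the noise.
N = 100
f = lambda k: float(k)            # exact sum: 0 + 1 + ... + 99 = 4950

def mc_sum(P, sampler, n_samples):
    total = 0.0
    for _ in range(n_samples):
        k = sampler()
        total += f(k) / P(k)      # unbiased: E[f/P] = sum_k f(k)
    return total / n_samples

# Uniform sampling: P(k) = 1/N (high variance).
est_uniform = mc_sum(lambda k: 1.0 / N,
                     lambda: random.randrange(N), 20000)

# Importance sampling: P(k) proportional to k + 1, roughly proportional to f.
weights = [k + 1 for k in range(N)]
total_w = float(sum(weights))
est_imp = mc_sum(lambda k: (k + 1) / total_w,
                 lambda: random.choices(range(N), weights=weights)[0], 20000)

print(est_uniform, est_imp)  # both near 4950; the second fluctuates far less
```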

Monte Carlo with tensor networks
– MPS: Sandvik and Vidal, Phys. Rev. Lett. 99 (2007).
– CPS: Schuch, Wolf, Verstraete, and Cirac, Phys. Rev. Lett. 100 (2008); Neuscamman, Umrigar, and Garnet Chan, arXiv (2011); etc.
– PEPS: Wang, Pižorn, and Verstraete, Phys. Rev. B 83 (2011) (no variational optimization).
– Unitary TN: Ferris and Vidal, Phys. Rev. B 85 (2012).
– 1D MERA: Ferris and Vidal, Phys. Rev. B 85 (2012).

Perfect vs. Markov chain sampling
Perfect sampling: generating s directly from P(s). This is often harder than calculating P(s) from a given s! Instead, use a Markov chain update, e.g. the Metropolis algorithm:
– propose a random s'
– accept s' with probability min[P(s')/P(s), 1]
Autocorrelation: subsequent samples are "close" to each other.
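As a concrete illustration of the Metropolis step above (a toy distribution of my choosing, not the talk's MPS sampler): only the ratio P(s')/P(s) is ever needed, so the normalization of P never has to be computed.

```python
import random

random.seed(1)

# Single-bit-flip Metropolis sampler for bitstrings s of length L, with
# target P(s) proportional to w(s) = 2**(number of ones in s).
L = 10

def weight(s):
    return 2.0 ** sum(s)

s = [0] * L
ones_freq = 0.0
n_steps = 50000
for _ in range(n_steps):
    i = random.randrange(L)
    s_new = list(s)
    s_new[i] ^= 1                                # propose flipping one bit
    if random.random() < min(1.0, weight(s_new) / weight(s)):
        s = s_new                                # accept with prob min[P(s')/P(s), 1]
    ones_freq += sum(s) / L

ones_freq /= n_steps
print(ones_freq)  # under P, each bit equals 1 with probability 2/3
```

Subsequent configurations differ by at most one bit, which is exactly the autocorrelation ("close" samples) the slide warns about.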

Markov chain sampling of an MPS
Choose P(s) = |⟨s|ψ⟩|², where |s⟩ = |s₁⟩|s₂⟩…|s_L⟩ is a product basis state. A proposed configuration s' is accepted with probability min[P(s')/P(s), 1]; the cost per update is O(χ²L). [A. Sandvik & G. Vidal, PRL 99 (2007)]

Perfect sampling of a unitary MPS
Note that P(s₁, s₂, s₃, …) = P(s₁) P(s₂|s₁) P(s₃|s₁, s₂) …, so the configuration can be sampled site by site from the conditional distributions, in any basis. For a generic MPS the cost is O(χ³L), but if the tensors are unitary/isometric (each tensor contracts with its conjugate to the identity), the partial contractions simplify and the total cost is reduced to O(χ²L).
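The chain-rule factorization above is what makes ancestral ("perfect") sampling work. A toy sketch (a 2-state Markov chain stands in for the conditionals, which in the real algorithm come from contracting the unitary network site by site):

```python
import random

random.seed(2)

# Ancestral sampling: draw s1 from P(s1), then s2 from P(s2|s1), and so on.
# Every sample is independent of the previous one, so there is no
# autocorrelation, unlike Markov chain sampling.
T = [[0.9, 0.1],   # P(next | current = 0)
     [0.2, 0.8]]   # P(next | current = 1)
p0 = [0.5, 0.5]    # P(s1)

def draw(probs):
    return 0 if random.random() < probs[0] else 1

def perfect_sample(L):
    s = [draw(p0)]
    for _ in range(L - 1):
        s.append(draw(T[s[-1]]))   # sample s_k from P(s_k | s_{k-1})
    return s

samples = [perfect_sample(20) for _ in range(4000)]
# The first site should match p0: about half the chains start in state 0.
frac0 = sum(s[0] == 0 for s in samples) / len(samples)
print(frac0)
```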

Comparison: critical transverse-field Ising model
[Figure: perfect sampling vs. Markov chain sampling; Ferris & Vidal, PRB 85 (2012)]

Critical transverse-field Ising model
[Figure: perfect sampling vs. Markov chain MC for 50 and 250 sites; Ferris & Vidal, PRB 85 (2012)]

Multi-scale entanglement renormalization ansatz (MERA)
A numerical implementation of the real-space renormalization group:
– remove short-range entanglement
– coarse-grain the lattice

Sampling the MERA
Exact contraction of the MERA costs O(χ⁹); a single Monte Carlo sample costs only O(χ⁵).

Perfect sampling with the MERA
Cost reduced from O(χ⁹) to O(χ⁵). [Ferris & Vidal, PRB 85 (2012)]

Extracting expectation values
[Figure: Monte Carlo MERA results for the transverse-field Ising model, compared with the worst case]

Optimizing tensors
The environment of a tensor can be estimated by sampling, but statistical noise makes the usual SVD-based updates unstable.

Optimizing isometric tensors
Each tensor must remain isometric (W†W = 1), so we can't move in an arbitrary direction:
– the derivative must be projected onto the tangent space of the isometric manifold;
– then we must ensure the updated tensor remains isometric.
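A minimal numerical sketch of the two bullet points (my own NumPy illustration; the talk does not give explicit formulas): project a gradient onto the tangent space of the manifold of isometries W†W = 1, take a small step, then re-isometrize with a polar decomposition via the SVD. Real matrices are used, so † is just the transpose.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 8, 3
# Start from a random isometry: the Q factor of a QR decomposition.
W, _ = np.linalg.qr(rng.standard_normal((n, p)))
G = rng.standard_normal((n, p))          # some (noisy) gradient estimate

# Tangent-space projection for the Stiefel manifold of isometries
# (Euclidean metric): remove the component that would break W^T W = 1.
A = W.T @ G
delta = G - W @ (A + A.T) / 2.0

# Small step, then polar retraction back onto the isometric manifold.
U, _, Vt = np.linalg.svd(W - 0.1 * delta, full_matrices=False)
W_new = U @ Vt

print(np.allclose(W_new.T @ W_new, np.eye(p)))  # prints True
```

The polar step W_new = UV† is the closest isometry to the updated matrix, which is one standard way to satisfy the second bullet point.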

Results: finding ground states
[Figure: transverse-field Ising model vs. samples per update, compared with the exact contraction result; Ferris & Vidal, PRB 85 (2012)]

Accuracy vs. number of samples
[Figure: transverse-field Ising model, accuracy vs. samples per update; Ferris & Vidal, PRB 85 (2012)]

Discussion of performance
Sampling the MERA works well; optimization in the presence of noise is challenging, and new optimization techniques would be welcome. "Stochastic reconfiguration" is essentially the (imaginary) time-dependent variational principle (Haegeman et al.) used by the VMC community. The relative performance of Monte Carlo should be more favorable for 2D systems.
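For reference, a hedged sketch of one stochastic-reconfiguration step in its standard VMC form (this is the textbook update, not code from the talk; the sample data here are random placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)

# Given per-sample logarithmic derivatives O[k, i] = d ln(psi)/d alpha_i
# and local energies E[k], stochastic reconfiguration solves
# S d_alpha = -g for the parameter update, where S and g are covariances
# estimated from the Monte Carlo samples.
n_samples, n_params = 500, 4
O = rng.standard_normal((n_samples, n_params))   # log-derivative samples
E = rng.standard_normal(n_samples)               # local-energy samples

O_mean = O.mean(axis=0)
S = (O.T @ O) / n_samples - np.outer(O_mean, O_mean)   # metric S_ij
g = (O.T @ E) / n_samples - O_mean * E.mean()          # energy gradient g_i

eta = 0.05
eps = 1e-4                         # regularization: S is noisy and near-singular
d_alpha = -eta * np.linalg.solve(S + eps * np.eye(n_params), g)
print(d_alpha.shape)               # one shift per variational parameter
```

The diagonal shift eps is the usual practical fix for the statistical noise in S, the same difficulty the slide describes.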

Two-dimensional MERA
2D MERA contractions are significantly more expensive than in 1D, e.g. O(χ¹⁶) for exact contraction vs. O(χ⁸) per sample (Glen has new techniques…). The power of χ roughly halves because sampling removes half of the tensor-network diagram.

Conclusions & outlook
– We can effectively sample the MERA (and other unitary TNs).
– Optimization is challenging, but possible!
– Monte Carlo should be more effective in 2D, where there are more degrees of freedom to sample.
[Ferris & Vidal, PRB 85 (2012)]