Quantum fields as deep learning

Presentation transcript:

Quantum fields as deep learning Jae-Weon Lee (Jungwon univ.) arXiv:1708.07408

Brief History of Complexity
1920: Hilbert's program. The decision problem asks for an algorithm to decide whether a given statement is provable from the axioms using the rules of logic.
1931: Kurt Gödel's incompleteness theorem.
1936: Turing machine → computational complexity theory. Easy problem = in P, hard problem = not in P. Church–Turing thesis = "All physically computable functions are Turing-computable."
1981: Feynman proposes the quantum computer (N spins need 2^N coefficients). Quantum Church–Turing thesis = "A quantum Turing machine can efficiently simulate any realistic model of computation."
Church–Turing–Deutsch principle → the universe is equivalent to a Turing machine → digital physics.

Machine learning. "A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E." - Tom M. Mitchell
Supervised learning (data x with label y): classification, regression. Unsupervised learning (data x only): clustering, underlying probability density estimation. Semi-supervised learning. Reinforcement learning: the delayed-reward problem.
(Figure: handwritten digit images x with labels y such as 0, 6, 4, 7.)
Algorithms: gradient descent, regression, Bayesian methods, HMM, SVM, ANN.
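As a minimal illustration of the supervised setting and the gradient-descent algorithm listed above, here is a hedged sketch in Python (the synthetic data, learning rate, and variable names are illustrative assumptions, not from the talk):

```python
import numpy as np

# Minimal supervised learning: fit y = w*x + b by gradient descent on synthetic data.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 2.0 * x + 0.5 + 0.1 * rng.normal(size=100)   # "experience E": (x, y) pairs

w, b, lr = 0.0, 0.0, 0.1
for epoch in range(200):
    y_hat = w * x + b                        # task T: regression
    grad_w = np.mean(2 * (y_hat - y) * x)    # gradient of the mean squared error
    grad_b = np.mean(2 * (y_hat - y))
    w -= lr * grad_w                         # performance P (the MSE) improves with E
    b -= lr * grad_b

print(f"learned w = {w:.2f}, b = {b:.2f}")   # close to the true values 2.0 and 0.5
```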

Artificial neural networks (ANN). The brain: O(10^11) neurons, each with O(10^4) synapses. An artificial neuron computes a weighted sum of its inputs (weights, threshold) and passes it through an activation function. Perceptron: Rosenblatt, 1958.
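A sketch of Rosenblatt's perceptron learning rule on a toy linearly separable problem, the AND function; the data, learning rate, and number of epochs are illustrative assumptions:

```python
import numpy as np

# Rosenblatt perceptron: weighted sum plus threshold (step) activation.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])   # inputs
y = np.array([0, 0, 0, 1])                       # AND labels (linearly separable)

w = np.zeros(2)   # weights
b = 0.0           # bias (threshold)
lr = 0.1

for epoch in range(20):
    for xi, yi in zip(X, y):
        out = 1 if np.dot(w, xi) + b > 0 else 0   # step activation function
        w += lr * (yi - out) * xi                 # perceptron update rule
        b += lr * (yi - out)

print(w, b)   # converges to e.g. w = [0.2, 0.1], b = -0.2, which separates AND
```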

How deep learning recognizes a dog (Roger Parloff).

Why does deep learning work? RG? (Pankaj Mehta, David J. Schwab, arXiv:1410.3831.) Combinations of simple physical laws? (Lin, Tegmark, arXiv:1608.08225.)

Hopfield network (energy based, deterministic). Training a Hopfield net = lowering the energy of the states that the net should "remember". Properties: 1) symmetric weights, w_ij = w_ji; 2) asynchronous updates → associative memory.
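A minimal sketch of a Hopfield net used as an associative memory, assuming Hebbian storage of two binary patterns (the patterns and sizes are illustrative):

```python
import numpy as np

# Hopfield network: Hebbian storage, asynchronous updates, energy descent.
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])        # states the net should "remember"
N = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns) / N        # Hebb rule gives w_ij = w_ji
np.fill_diagonal(W, 0)                               # no self-connections

def energy(s):
    return -0.5 * s @ W @ s

s = np.array([-1, -1, 1, -1, 1, -1])                 # noisy version of pattern 0
for sweep in range(10):                              # asynchronous updates
    for i in np.random.permutation(N):
        s[i] = 1 if W[i] @ s >= 0 else -1            # each update never increases the energy

print(s, energy(s))   # recalls the stored pattern [1, -1, 1, -1, 1, -1]
```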

Boltzmann machine (BM): the stochastic (Monte Carlo), generative counterpart of Hopfield nets. Similar to a spin glass. Does unsupervised learning, but training the fully connected machine is impractical.

Restricted Boltzmann machine (RBM): no two nodes of the same layer are linked; the interactions run only between the visible and hidden layers.
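A hedged sketch of training an RBM with one step of contrastive divergence (CD-1); the toy binary data, layer sizes, and learning rate are illustrative assumptions, not from the talk:

```python
import numpy as np

# Restricted Boltzmann machine: bipartite couplings between visible v and hidden h only.
rng = np.random.default_rng(0)
n_v, n_h, lr = 6, 3, 0.1
W = 0.01 * rng.normal(size=(n_v, n_h))   # couplings between the two layers only
a = np.zeros(n_v)                        # visible biases
b = np.zeros(n_h)                        # hidden biases

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
data = rng.integers(0, 2, size=(20, n_v)).astype(float)   # toy binary data

for epoch in range(100):
    for v0 in data:
        ph0 = sigmoid(v0 @ W + b)                    # p(h=1|v), exact thanks to the restriction
        h0 = (rng.random(n_h) < ph0).astype(float)   # Gibbs sample of the hidden layer
        pv1 = sigmoid(h0 @ W.T + a)                  # p(v=1|h)
        v1 = (rng.random(n_v) < pv1).astype(float)   # one-step reconstruction
        ph1 = sigmoid(v1 @ W + b)
        # CD-1 update: <v h>_data - <v h>_reconstruction approximates the likelihood gradient
        W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
        a += lr * (v0 - v1)
        b += lr * (ph0 - ph1)
```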

Autoencoder: dimensionality reduction. The network is trained to reconstruct its input through a narrower hidden (code) layer.
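A minimal sketch of a linear autoencoder that compresses 6-dimensional inputs to a 2-dimensional code; the architecture, data, and learning rate are illustrative assumptions:

```python
import numpy as np

# Linear autoencoder: encode x -> z (2-d code), decode z -> x_hat, reduce reconstruction error.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6)) @ rng.normal(size=(6, 6))   # correlated toy data
W_enc = 0.1 * rng.normal(size=(6, 2))                     # encoder weights
W_dec = 0.1 * rng.normal(size=(2, 6))                     # decoder weights
lr = 0.01

for epoch in range(500):
    Z = X @ W_enc                    # low-dimensional code (dimensionality reduction)
    X_hat = Z @ W_dec                # reconstruction
    err = X_hat - X
    W_dec -= lr * Z.T @ err / len(X)             # gradient step on the reconstruction error
    W_enc -= lr * X.T @ (err @ W_dec.T) / len(X)

print(np.mean(err ** 2))   # mean squared reconstruction error, decreasing during training
```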

Renormalization (Kadanoff block spins): coarse-grain the visible spins v into hidden block spins h. The spins are distributed with the Boltzmann distribution p(v) = e^{-H(v)}/Z. Variational RG: the coarse-graining is encoded in a coupling operator T(v, h) that defines an effective Hamiltonian for the hidden spins.
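Written out, the variational RG step (as used in the paper cited on the next slide) defines the coarse-grained Hamiltonian by tracing out the visible spins; a sketch of the standard formulas:

```latex
p(\mathbf{v}) = \frac{e^{-H(\mathbf{v})}}{Z}, \qquad
e^{-H^{RG}_{\lambda}[\mathbf{h}]} \;\equiv\; \operatorname{Tr}_{\mathbf{v}}\, e^{\,T_{\lambda}(\mathbf{v},\mathbf{h}) - H(\mathbf{v})},
```

and the variational parameters λ are chosen to minimize the free-energy difference between the coarse-grained and original systems, which vanishes for an exact RG transformation.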

An exact mapping between the Variational Renormalization Group and Deep Learning arXiv:1410.3831 Pankaj Mehta, David J. Schwab
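The mapping of Mehta and Schwab identifies the RG coupling operator with minus the RBM energy plus the original Hamiltonian, so that one variational RG step is exactly one RBM layer; a sketch of the key identification:

```latex
E(\mathbf{v},\mathbf{h}) = -\sum_i a_i v_i - \sum_j b_j h_j - \sum_{i j} v_i W_{i j} h_j, \qquad
T_{\lambda}(\mathbf{v},\mathbf{h}) = -E(\mathbf{v},\mathbf{h}) + H(\mathbf{v}),
```

so marginalizing the RBM over v reproduces the coarse-grained Boltzmann weight e^{-H^{RG}_λ[h]}, and stacking RBMs (a deep network) corresponds to iterating the RG.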

AdS/MERA (Swingle 2009): the MERA network for the ground state of a CFT_d looks like a discretized time slice of AdS_{d+1}. Ryu-Takayanagi formula (2006): the entanglement entropy of a boundary region A is computed by the minimal bulk surface γ_A anchored on A. After s coarse-graining steps the network covers a scale L ~ 2^s, so E.E. ~ # of traced-out indices ~ # of cuts ~ s ~ log(L) → the geodesic in AdS. → Redundancy, QEC, holography…
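For reference, the Ryu-Takayanagi formula and the matching result for a single interval of length L in a 1+1-dimensional CFT (standard formulas, quoted as a sketch):

```latex
S_A = \frac{\operatorname{Area}(\gamma_A)}{4 G_N}, \qquad
S_A^{\mathrm{CFT}_2} = \frac{c}{3} \log\frac{L}{\epsilon},
```

where γ_A is the minimal bulk surface homologous to A, c is the central charge, and ε is a UV cutoff; the log L scaling is exactly what the ~s cut bonds of the MERA reproduce.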

Holography as deep learning. Wen-Cong Gan and Fu-Wen Shu, arXiv:1705.05750.

Gravity from entanglement. Lee, Lee, Kim, arXiv:1001.5445; JKPS 2013. The Minkowski vacuum, written as a Euclidean path integral and restricted to the right Rindler wedge, is thermal with respect to the boost Hamiltonian, which generates translations in the proper time of an observer with proper acceleration a: the Unruh effect, with an energy flux dE across the horizon. The same applies to a black-hole horizon. (Figure: left/right wedges L, R in the (t, x) plane.)
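The standard statement behind this slide, quoted as a sketch: the reduced density matrix of the Minkowski vacuum in a Rindler wedge is thermal with respect to the boost generator, with the temperature set by the proper acceleration,

```latex
\rho_R = \operatorname{Tr}_L |0\rangle\langle 0| = \frac{e^{-2\pi K_R}}{Z}, \qquad
T_U = \frac{\hbar\, a}{2\pi c\, k_B},
```

where K_R is the dimensionless boost generator of the right wedge.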

Quantum fields as deep RBM arXiv:1708.07408 Higgs?
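As a rough schematic of the kind of analogy at work here (my own sketch of the general idea, not the specific construction of arXiv:1708.07408): a discretized Euclidean field theory assigns Boltzmann weights to field configurations in the same way an RBM assigns Boltzmann weights to visible and hidden units, so fields on successive slices or layers can play the roles of visible and hidden nodes:

```latex
Z = \int \prod_x d\phi_x \; e^{-S_E[\phi]}
\;\;\longleftrightarrow\;\;
p(\mathbf{v},\mathbf{h}) = \frac{e^{-E(\mathbf{v},\mathbf{h})}}{Z_{\mathrm{RBM}}},
```

with the Euclidean action S_E[φ] playing the role of the RBM energy E(v, h).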

Conclusion Deep learning has physics in it Quantum fields, and spacetime, could be deep learning [Q] Can nature think? If yes, then why doesn’t it speak to us??

Machine learning spatial geometry from entanglement (Yi-Zhuang You et al., arXiv:1709.01223). Random tensor network (RTN) states satisfy the Ryu-Takayanagi formula (Hayden et al., arXiv:1601.01694).

Other ideas: 1) the correspondence between Euclidean quantum field theory in (d+1)-dimensional flat spacetime and statistical mechanics in (d+1)-dimensional flat space, obtained by using an imaginary time; 2) quantum neural networks (QNN).

Entanglement entropy of a state: how broadly the state is spread over, or how much information is in, a density matrix. Split the system into a subsystem A and its complement (Ex: the vacuum or ground state). Partial trace: ρ_A = Tr_{A^c} ρ. Entanglement entropy = von Neumann entropy of the subsystem, S_A = -Tr(ρ_A log ρ_A); for pure states, S_A = S_{A^c}. The more correlated the parts, the less we know about the subsystem alone.
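A minimal numerical check of these definitions, assuming the two-qubit Bell state as the example (the choice of state is illustrative):

```python
import numpy as np

# Entanglement entropy of subsystem A for a pure two-qubit state.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)        # (|00> + |11>)/sqrt(2)
rho = np.outer(bell, bell.conj())                 # density matrix of the pure state

# Partial trace over B: reshape to (A, B, A', B') and trace out the B indices.
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

evals = np.linalg.eigvalsh(rho_A)
S_A = -sum(p * np.log(p) for p in evals if p > 1e-12)   # von Neumann entropy
print(rho_A)              # maximally mixed: [[0.5, 0], [0, 0.5]]
print(S_A, np.log(2))     # S_A = log 2, the maximum for a single qubit
```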

Vanishing gradient problem. A multi-layer perceptron is trained by backpropagation (Rumelhart, Hinton & Williams 1986, Nature): the error at the output is propagated backward to the input using the chain rule. The derivative of the logistic activation is at most 1/4, so the product of many such factors makes the gradients of the early layers vanish.
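A small numerical illustration (the depth and random weights are arbitrary assumptions): the backpropagated gradient is a product of per-layer factors σ'(z)·w, and with σ' ≤ 1/4 it shrinks roughly geometrically with depth:

```python
import numpy as np

# Chain rule through a deep stack of one-unit logistic layers: gradient = product of sigma'(z) * w.
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
depth = 30
weights = rng.normal(size=depth)      # one weight per layer (toy network)

x = 1.0
grad = 1.0
for w in weights:                     # forward pass, accumulating d(output)/d(input)
    z = w * x
    grad *= sigmoid(z) * (1 - sigmoid(z)) * w   # sigma'(z) <= 1/4 for every layer
    x = sigmoid(z)

print(abs(grad))   # a very small number: the gradient reaching the input layer has vanished
```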

Tensor networks (Orús, arXiv:1306.2164): ansätze to find the ground states of critical quantum systems in an efficient manner (MPS, PEPS, MERA, …). Ex) a generic N-spin wavefunction needs 2^N coefficients (Feynman's original motivation for quantum computers); a tensor-network ansatz for the ground state needs far fewer parameters (Vidal).
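A sketch of the basic idea behind matrix product states: split the amplitude tensor site by site with (optionally truncated) SVDs. The sizes and truncation threshold below are illustrative assumptions:

```python
import numpy as np

# Decompose an N-spin state into a matrix product state (MPS) by successive SVDs.
N, d = 8, 2
rng = np.random.default_rng(0)
psi = rng.normal(size=d ** N)
psi /= np.linalg.norm(psi)             # generic state: 2^N coefficients

mps, chi_max = [], 16                  # the bond-dimension cap chi_max controls the cost
tensor = psi.reshape(1, -1)            # (left bond) x (remaining sites)
for site in range(N - 1):
    chi_l = tensor.shape[0]
    tensor = tensor.reshape(chi_l * d, -1)
    U, S, Vh = np.linalg.svd(tensor, full_matrices=False)
    chi = min(chi_max, int(np.sum(S > 1e-12)))           # keep only significant Schmidt values
    mps.append(U[:, :chi].reshape(chi_l, d, chi))        # site tensor A[left, spin, right]
    tensor = (S[:chi, None] * Vh[:chi]).reshape(chi, -1)
mps.append(tensor.reshape(-1, d, 1))

print([A.shape for A in mps])   # storage scales as N * d * chi^2 instead of d^N
```

For a generic random state the bond dimension saturates and nothing is gained; the point of MPS/PEPS/MERA is that ground states of local Hamiltonians typically have limited (area-law) entanglement, so a small bond dimension suffices.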

arXiv:1112.4101

MERA (multiscale entanglement renormalization ansatz), Vidal 2007. The extra network direction plays the role of an RG "time": isometries remove UV degrees of freedom (they preserve the inner product), and disentanglers remove UV (short-range) entanglement.
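The defining constraints on the MERA tensors, quoted as a sketch: the isometries w and the disentanglers u satisfy

```latex
w^{\dagger} w = \mathbb{1}, \qquad u^{\dagger} u = u\, u^{\dagger} = \mathbb{1},
```

which is what guarantees that each coarse-graining layer preserves the inner product while removing short-range entanglement.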

Jacobson's great idea (see also Padmanabhan). Assume the 1st law δQ = T δS on every local Rindler horizon, with the Unruh temperature and an entropy proportional to the horizon area. Writing this in covariant form, using the Raychaudhuri equation, demanding it for all null directions, and then using the contracted Bianchi identity, one obtains the Einstein equation: the Einstein equation is related to the 1st law for local Rindler horizons!
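A compressed sketch of the standard derivation (signs and factors as usually quoted, units with c = 1):

```latex
\delta Q = T\, \delta S, \qquad T = \frac{\hbar a}{2\pi}, \qquad S = \frac{A}{4 G \hbar}
\quad\Longrightarrow\quad
R_{ab} - \tfrac{1}{2} R\, g_{ab} + \Lambda\, g_{ab} = 8\pi G\, T_{ab},
```

where the heat flux across the local horizon is δQ ∝ ∫ T_ab k^a k^b, the Raychaudhuri equation gives the area change δA ∝ -∫ R_ab k^a k^b, demanding the first law for all null k^a fixes R_ab up to a term proportional to g_ab, and the contracted Bianchi identity (with energy conservation) fixes that term to -½ R g_ab + Λ g_ab.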

Entanglement 1st law
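The statement usually meant by this phrase, quoted as a sketch: for a small perturbation of a state, the change in the entanglement entropy of a region equals the change in the expectation value of its modular Hamiltonian,

```latex
\delta S_A = \delta \langle K_A \rangle, \qquad K_A \equiv -\log \rho_A,
```

which, applied to ball-shaped regions of a holographic CFT together with the Ryu-Takayanagi formula, is known to reproduce the linearized Einstein equations in the bulk.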

Holographic quantum error-correcting code (HaPPY). A perfect tensor with 2n indices: any bipartition into n and n indices is maximally entangled, and any map from at most n indices to the rest is an isometry. Feeding one index with the logical qubit gives a 2n-1 qubit error-correcting code for 1 qubit; for n = 3 this is the 5-qubit code, which can tolerate the loss of 2 qubits. Logical operators preserving the code space reside in the bulk; geodesic lines correspond to EPR pairs; the quantum code lives on the boundary.
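As a concrete instance (standard facts, quoted as a sketch): the n = 3 perfect tensor is the encoding isometry of the [[5,1,3]] five-qubit code, whose stabilizer group is generated by cyclic shifts of XZZXI,

```latex
S = \langle\, XZZXI,\; IXZZX,\; XIXZZ,\; ZXIXZ \,\rangle, \qquad [[n,k,d]] = [[5,1,3]];
```

a distance-3 code corrects any single unknown error and, equivalently, any d - 1 = 2 erasures at known locations, which is the "loss of 2 qubits" above.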

The activation function needs to be nonlinear. Logistic (sigmoid): its derivative is convenient to compute. Rectified linear unit (ReLU). (Moujahid)
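The two activations named on the slide and their derivatives (standard formulas):

```latex
\sigma(x) = \frac{1}{1 + e^{-x}}, \quad \sigma'(x) = \sigma(x)\bigl(1 - \sigma(x)\bigr) \le \tfrac{1}{4};
\qquad
\mathrm{ReLU}(x) = \max(0, x), \quad \mathrm{ReLU}'(x) = \Theta(x),
```

the bounded σ' is what drives the vanishing-gradient problem above, while the ReLU derivative does not shrink for positive inputs.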

Entanglement as spacetime glue (Van Raamsdonk, arXiv:1005.3035). The entanglement between boundary regions A and B is computed by a minimal surface; the smaller the entanglement, the less connected the spacetime. Two CFTs in the Hartle-Hawking (thermofield double) state are dual to the eternal AdS black hole spacetime; their entanglement entropy equals the thermal entropy of the black hole (Maldacena, hep-th/0106112). A quantum superposition of disconnected spacetimes → emergent spacetime.
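The two-CFT state referred to here is the thermofield double (Hartle-Hawking) state; quoting the standard expression as a sketch:

```latex
|\mathrm{TFD}\rangle = \frac{1}{\sqrt{Z(\beta)}} \sum_n e^{-\beta E_n / 2}\, |n\rangle_L \otimes |n\rangle_R,
```

tracing out one CFT leaves the other in the thermal state e^{-βH}/Z, so the entanglement entropy between the two CFTs equals the thermal (black-hole) entropy.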

Modern interpretation. (Figure: left and right Rindler wedges L and R in the (t, x) plane, with a uniformly accelerated observer of proper acceleration a.)