The Hopfield Model - Jonathan Amazon.

Neural Network

A neural network can be modeled as a spin glass. Each neuron is a spin that can be in an excited state (s = +1) or a quiet state (s = -1). The synapses between neurons are the spin couplings, which can be excitatory (ferromagnetic, J > 0) or inhibitory (anti-ferromagnetic, J < 0). Experimentally, neurons spend most of their time in the quiet state because of an activation threshold; an external field H < 0 captures this behavior.

Spin Glass An Ising model with non-uniform coupling strengths. The couplings are quenched variables, usually drawn from a random distribution.
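As a sketch of this setup (not from the slides), the energy of such a system is E = -½ Σij Jij si sj - H Σi si. A minimal NumPy version, with the Gaussian coupling distribution chosen purely for illustration:

```python
import numpy as np

def energy(J, s, H=0.0):
    """Spin-glass energy E = -1/2 * sum_ij J_ij s_i s_j - H * sum_i s_i.
    J is a symmetric coupling matrix with zero diagonal; s is a +/-1 vector."""
    return -0.5 * s @ J @ s - H * s.sum()

# quenched couplings drawn from a Gaussian (illustrative choice), symmetrized
rng = np.random.default_rng(1)
n = 50
J = rng.normal(size=(n, n))
J = (J + J.T) / 2
np.fill_diagonal(J, 0.0)
s = rng.choice([-1, 1], size=n)
print(energy(J, s))
```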

Hopfield Network A spin-glass neural network. Completely connected. The couplings are not drawn from a distribution; instead, pre-defined memory states are encoded into the coupling strengths. The Hebbian rule fixes the couplings so that the memory states become (mostly) minimal-energy configurations. This gives the network its associative memory properties. The memory states are generated randomly, with equal probability of up or down for each spin.
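The Hebbian construction described above can be sketched in NumPy (the function name and the sizes p = 3, N = 100 are my own illustrative choices):

```python
import numpy as np

def hebbian_couplings(patterns):
    """Hebbian rule: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with no
    self-coupling (J_ii = 0). patterns has shape (p, N), entries +/-1."""
    p, n = patterns.shape
    J = patterns.T @ patterns / n
    np.fill_diagonal(J, 0.0)
    return J

# p = 3 random memory states of N = 100 spins, each spin +1 or -1
# with equal probability, as in the slides
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 100))
J = hebbian_couplings(patterns)
```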

Dynamics Method of descent: calculate the local field on each spin from all the other spins and compare it to the activation threshold (H = 0 in my simulation, which implies inversion symmetry of the Hamiltonian), then flip the spin accordingly. The total energy is monotonically decreasing, so the system settles into a local energy well.
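A sketch of these descent dynamics (asynchronous updates, threshold H = 0; the function and parameter names are my own):

```python
import numpy as np

def relax(J, s, max_sweeps=100):
    """Flip each spin to align with its local field h_i = sum_j J_ij s_j
    (activation threshold H = 0). Each flip lowers the energy, so the
    state descends to a local minimum; stop when a full sweep changes nothing."""
    s = s.copy()
    for _ in range(max_sweeps):
        prev = s.copy()
        for i in range(len(s)):  # asynchronous, in-place updates
            s[i] = 1 if J[i] @ s >= 0 else -1
        if np.array_equal(s, prev):
            break
    return s
```

With symmetric couplings and zero diagonal, asynchronous updates guarantee monotone energy descent; fully synchronous updates can instead fall into two-cycles.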

Associative Memory Relaxation from an arbitrary starting state to the nearest energy minimum. With the Hebbian rule, the local minimum reached will be the memory state most closely resembling the starting state, or its inverse (a two-fold degeneracy). Memory capacity is measured extensively as p/N (the memory density). There is a critical memory threshold above which your 'brain explodes': how much is too much?
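A self-contained sketch of associative recall (the sizes n = 200, p = 5, the seed, and the 15% corruption level are illustrative choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 200, 5                        # memory density p/N = 0.025
patterns = rng.choice([-1, 1], size=(p, n))
J = patterns.T @ patterns / n        # Hebbian rule
np.fill_diagonal(J, 0.0)

# corrupt 15% of the spins of memory state 0, then relax by descent
s = patterns[0].copy()
flipped = rng.choice(n, size=30, replace=False)
s[flipped] *= -1
for _ in range(100):
    prev = s.copy()
    for i in range(n):               # asynchronous updates, threshold H = 0
        s[i] = 1 if J[i] @ s >= 0 else -1
    if np.array_equal(s, prev):
        break

overlap = abs(s @ patterns[0]) / n   # |overlap| counts the inverse state too
print(overlap)
```

At this low memory density the relaxed state should closely match the stored pattern (or its inverse).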

Thermodynamic Limit [Plot (after D. Amit, H. Gutfreund, and H. Sompolinsky): percent of misaligned spins vs. memory density.] Critical memory density at p/N ≈ 0.138; above it, all energy minima are null-correlated with the desired memory states.

Memory Reliability Testing the reliability of memory storage: initialize in a pure state and relax the lattice to its ground state. The test is all-or-nothing: does the relaxed state match the initial state perfectly? This measures the percentage of trials in which the lattice successfully retained the memory state.
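The all-or-nothing test above might be sketched as follows (a self-contained illustration; function name, trial count, and seed are my own):

```python
import numpy as np

def success_rate(n, p, trials=20, seed=0):
    """All-or-nothing reliability test: for each trial, store p random
    patterns, initialize in one of them (a pure state), relax by descent,
    and count the fraction of trials where the relaxed state matches exactly."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(trials):
        patterns = rng.choice([-1, 1], size=(p, n))
        J = patterns.T @ patterns / n          # Hebbian couplings
        np.fill_diagonal(J, 0.0)
        s = patterns[0].copy()
        for _ in range(100):                   # relax to a local minimum
            prev = s.copy()
            for i in range(n):
                s[i] = 1 if J[i] @ s >= 0 else -1
            if np.array_equal(s, prev):
                break
        hits += np.array_equal(s, patterns[0])
    return hits / trials
```

Well below the critical density the rate should be near 1; sweeping p upward traces out the reliability curve.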

[Plot: percent chance of recovering the pure state vs. memory density.]

Memory Degradation Testing how memory degrades as the memory density increases. CASE 1: start in a pure state, relax the network, and record the percent of spins that differ from the initial state. CASE 2: start in a random state, relax the network, determine the closest pure state (prone to bias when null-correlated), and record the percent of spins that differ from that closest memory state.
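CASE 1 might be sketched as below (self-contained; the averaging over trials and all names are my own choices):

```python
import numpy as np

def misaligned_fraction(n, p, trials=10, seed=0):
    """CASE 1 of the degradation test: start in a pure state, relax by
    descent, and average the fraction of spins differing from that state.
    A pattern and its global inverse count as the same memory."""
    rng = np.random.default_rng(seed)
    fracs = []
    for _ in range(trials):
        patterns = rng.choice([-1, 1], size=(p, n))
        J = patterns.T @ patterns / n          # Hebbian couplings
        np.fill_diagonal(J, 0.0)
        s = patterns[0].copy()
        for _ in range(100):                   # relax to a local minimum
            prev = s.copy()
            for i in range(n):
                s[i] = 1 if J[i] @ s >= 0 else -1
            if np.array_equal(s, prev):
                break
        overlap = abs(s @ patterns[0]) / n     # |overlap| handles the inverse
        fracs.append((1 - overlap) / 2)
    return float(np.mean(fracs))
```

Sweeping p/N toward and past ~0.138 should show the misaligned fraction jumping from near zero to the null-correlated value.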

[Plot: percent of misaligned spins vs. memory density, annotated "Finite size effects? Sampling bias?"]

[Plot: percent of misaligned spins vs. memory density.]

Applications Facial recognition (security cameras, digital cameras, ...). Handwriting recognition (scanners, LaTeX help, ...). Numerical/graphical operations.