Neural Networks Chapter 2 Joost N. Kok Universiteit Leiden

Hopfield Networks A Hopfield network is a network of McCulloch-Pitts neurons. Neuron $i$ outputs $S_i = +1$ iff its weighted input reaches its threshold, i.e. $\sum_j w_{ij} S_j \ge \theta_i$, and $S_i = -1$ otherwise.
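A minimal sketch of this firing rule in code (the names `weights`, `theta`, and `state` are illustrative, not from the slides):

```python
import numpy as np

def update_neuron(i, state, weights, theta):
    """Threshold update for one McCulloch-Pitts unit:
    output +1 iff the weighted input reaches the threshold."""
    h = weights[i] @ state - theta[i]   # local field of neuron i
    return 1 if h >= 0 else -1
```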


Associative Memory Problem: Store a set of patterns in such a way that when presented with a new pattern, the network responds by producing whichever of the stored patterns most closely resembles the new pattern.

Hopfield Networks "Most closely resembles" is measured by Hamming distance. The configuration space is the set of all possible states of the network. The stored patterns should be attractors of the dynamics, each surrounded by its own basin of attraction.
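For $\pm 1$ patterns the Hamming distance is simply the number of disagreeing positions, as in this small sketch (function name illustrative):

```python
import numpy as np

def hamming(x, y):
    """Number of positions where two ±1 patterns disagree."""
    return int(np.sum(np.asarray(x) != np.asarray(y)))
```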

Hopfield Networks The network consists of N neurons, each with two states: -1 (silent) and +1 (firing). The neurons are fully connected, the weights are symmetric, and each neuron has a threshold.

Hopfield Networks [Figure: example network with connection weights such as $w_{13}$, $w_{16}$, $w_{57}$ labeled, and a unit in state +1]

Hopfield Networks State: $S = (S_1, \dots, S_N)$ with $S_i \in \{-1, +1\}$. Weights: $w_{ij} = w_{ji}$, $w_{ii} = 0$. Dynamics: $S_i := \mathrm{sgn}\!\left(\sum_j w_{ij} S_j - \theta_i\right)$.

Hopfield Networks Hebb’s learning rule: –Make the connection stronger if the neurons have the same state –Make the connection weaker if the neurons have different states

Hopfield Networks [Figure: neuron 1 connected to neuron 2 by a synapse]

Hopfield Networks The weight between neuron $i$ and neuron $j$ for storing patterns $\xi^1, \dots, \xi^p$ is given by $w_{ij} = \frac{1}{N} \sum_{\mu=1}^{p} \xi_i^{\mu} \xi_j^{\mu}$.
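A minimal sketch of Hebbian storage and recall, assuming zero thresholds and asynchronous updates (all names are illustrative):

```python
import numpy as np

def store(patterns):
    """Hebbian weights: w_ij = (1/N) * sum_mu xi_i^mu * xi_j^mu, w_ii = 0."""
    patterns = np.asarray(patterns)          # shape (p, N), entries ±1
    n = patterns.shape[1]
    w = patterns.T @ patterns / n            # sum of outer products over patterns
    np.fill_diagonal(w, 0.0)                 # no self-connections
    return w

def recall(w, state, steps=100, rng=None):
    """Asynchronous threshold updates for a fixed number of steps."""
    rng = rng or np.random.default_rng()
    state = np.array(state, dtype=float)
    for _ in range(steps):
        i = rng.integers(len(state))         # pick a random unit
        state[i] = 1.0 if w[i] @ state >= 0 else -1.0
    return state
```

Starting `recall` from a noisy version of a stored pattern should return that pattern, provided the number of stored patterns stays below the capacity limit discussed next.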

Hopfield Networks Opposite patterns ($\xi$ and $-\xi$) give the same weights, which implies that the inverses of the stored patterns are also stable points of the network. The capacity of a Hopfield network is limited to about 0.14 N random patterns; for example, a network of 100 neurons can reliably store roughly 14 patterns.

Hopfield Networks Hopfield defines the energy of a network: $E = -\tfrac{1}{2} \sum_{ij} S_i S_j w_{ij} + \sum_i S_i \theta_i$. If we pick unit $i$ and the firing rule does not change its $S_i$, then $E$ does not change. If we pick unit $i$ and the firing rule does change its $S_i$, then $E$ decreases.
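To see why a flip decreases the energy, write the local field as $h_i = \sum_j w_{ij} S_j - \theta_i$. The terms of $E$ involving unit $i$ sum to $-S_i h_i$ (using $w_{ij} = w_{ji}$ and $w_{ii} = 0$), so flipping $S_i$ to $-S_i$ changes the energy by $\Delta E = 2 S_i h_i$. The firing rule flips $S_i$ only when $S_i$ and $h_i$ have opposite signs, so $\Delta E \le 0$; since $E$ is bounded below, the dynamics must converge to a (local) minimum.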

Hopfield Networks Energy function: $E = -\tfrac{1}{2} \sum_{ij} w_{ij} S_i S_j + \sum_i \theta_i S_i$. Alternative form (matrix notation): $E = -\tfrac{1}{2} S^{\top} W S + \theta^{\top} S$. Updates: $S_i := \mathrm{sgn}\!\left(\sum_j w_{ij} S_j - \theta_i\right)$, one unit at a time.
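A sketch of the energy function in code, matching the matrix form above (names illustrative):

```python
import numpy as np

def energy(w, state, theta=None):
    """E = -1/2 * S^T W S + theta^T S (theta defaults to zero)."""
    state = np.asarray(state, dtype=float)
    theta = np.zeros_like(state) if theta is None else np.asarray(theta)
    return -0.5 * state @ w @ state + theta @ state
```

Tracking `energy` across the asynchronous updates of `recall` above should produce a non-increasing sequence of values.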


Hopfield Networks Extension: use a stochastic firing rule – $S_i := +1$ with probability $g(h_i)$ – $S_i := -1$ with probability $1 - g(h_i)$

Hopfield Networks Nonlinear function: $g(x) = \dfrac{1}{1 + e^{-\beta x}}$ [Figure: sigmoid curve of $g(x)$ rising from 0 to 1]

Hopfield Networks Replace the binary threshold units by binary stochastic units. Define $\beta = 1/T$ and use the “temperature” $T$ to make it easier to cross energy barriers (see the sketch below). –Start at a high temperature, where it is easy to cross energy barriers. –Reduce the temperature slowly, so that at a low temperature good states are much more probable than bad ones. [Figure: energy landscape with states labeled A, B, C]
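A minimal sketch of the stochastic rule with a simple cooling schedule, reusing the `store` weights from earlier and the slide’s $g(x) = 1/(1+e^{-\beta x})$; the function names and the particular schedule are illustrative assumptions:

```python
import numpy as np

def g(h, beta):
    """Slide's firing probability: g(h) = 1 / (1 + exp(-beta * h))."""
    return 1.0 / (1.0 + np.exp(-beta * h))

def anneal(w, state, t_start=5.0, t_end=0.05, sweeps=200, rng=None):
    """Stochastic Hopfield dynamics while cooling from t_start to t_end."""
    rng = rng or np.random.default_rng()
    state = np.array(state, dtype=float)
    n = len(state)
    for t in np.geomspace(t_start, t_end, sweeps):  # cooling schedule
        beta = 1.0 / t
        for _ in range(n):                           # one sweep of updates
            i = rng.integers(n)
            h = w[i] @ state                         # local field (zero thresholds)
            state[i] = 1.0 if rng.random() < g(h, beta) else -1.0
    return state
```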

Hopfield Networks The noise kicks the network out of spurious local minima. Equilibrium: the probability distribution over network states becomes time-independent.