Neural Networks and Fuzzy Systems

Hopfield Network
A feedback neural network has feedback loops from its outputs to its inputs. The presence of such loops has a profound impact on the learning capability of the network. After a new input is applied, the network output is calculated and fed back to adjust the input. This process is repeated until the output becomes constant.
John Hopfield (1982):
– Associative memory via artificial neural networks
– Optimisation
Hopfield Network
[Diagram: a single-layer network of n neurons; every output y_i is fed back through weights w_ij as input x_j to the other neurons.]
It is a dynamic system: x(0) → y(0) → x(1) → y(1) → … → y*
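A minimal sketch of this feedback dynamic in Python/NumPy. The weight matrix W, the sign activation, and the synchronous update are assumptions here, anticipating the discrete Hopfield model developed later in these slides:

```python
import numpy as np

def run_hopfield(W, x0, max_iters=100):
    """Iterate x(0) -> y(0) -> x(1) -> ... until the state stops changing."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        y = np.sign(W @ x)           # outputs computed from the current input
        y[y == 0] = 1                # tie-break: treat a zero net input as +1
        if np.array_equal(y, x):     # outcome constant: a stable state y*
            return y
        x = y                        # feed the outputs back as the next input
    return x                         # no fixed point found within max_iters
```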
Attractor
If a state x(t) starts in a region S and x(t) → x* as t → ∞, then S is the attractive region.
– If x* = the desired state, x* is an attractor.
– If x* ≠ the desired state, x* is a spurious attractor.
[Diagram: a trajectory x(t) inside the region S converging to x*.]
Associative Memory
Nature of associative memory:
– part of the information is given;
– the rest of the pattern is recalled.
Hopfield networks can be used as associative memory:
– design the weights W so that the memorised patterns are the stable states x*;
– more than one pattern can be stored; capacity increases with the number of neurons.
[Diagram: an initial state x(0) converging to a stored pattern x*.]
Analogy with Optimisation
The location of the bottom of the bowl (X_0) represents the stored pattern; the ball's initial position represents the partial knowledge. On a corrugated surface we can store {X_1, X_2, …, X_n} as memories and recall the one closest to the initial state.
Hopfield networks can also be used for optimisation:
1) define an energy E such that an attractor minimises E;
2) difference from associative memory: we expect one (or few) attractors, but a large attractive region.
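For the discrete Hopfield network discussed in these slides, the energy is commonly taken as $E(x) = -\tfrac{1}{2} x^T W x$ (thresholds omitted); a minimal sketch, assuming that standard form:

```python
import numpy as np

def energy(W, x):
    """Discrete Hopfield energy E(x) = -1/2 * x^T W x.
    With symmetric W and zero diagonal, asynchronous updates never
    increase E, so the dynamics settle into a local minimum (an attractor)."""
    x = np.asarray(x, dtype=float)
    return -0.5 * x @ W @ x
```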
Two Types of Associative Memory
– Autoassociative memory. Pattern $I_i$: $I_i + \Delta \to I_i$
– Heteroassociative memory. Pattern pairs $I_i \to y_i$: $I_i + \Delta \to y_i$
Hopfield networks are used as autoassociative memory.
Hebbian Rule
Original rule proposed by Hebb (1949) in The Organization of Behavior:
"When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."
That is, the correlation of activity between two cells is reinforced by increasing the synaptic strength between them.
Hebbian Rule
If a neuron α and a neuron β are "on" at the same time, their synaptic connection is strengthened. The next time one of them is activated, it will tend to activate the other.
Hebbian Rule
In other words:
1. If two neurons on either side of a synapse (connection) are activated simultaneously (i.e. synchronously), then the strength of that synapse is selectively increased. (Activation ↑)
This rule is often supplemented by:
2. If two neurons on either side of a synapse are activated asynchronously, then that synapse is selectively weakened or eliminated. (Activation ↓)
$\Delta w_{ij} = y_i x_j$
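A minimal sketch of the rule $\Delta w_{ij} = y_i x_j$ applied to a whole layer at once via the outer product (the learning-rate parameter eta is an illustrative addition, not from the slides):

```python
import numpy as np

def hebbian_update(W, x, y, eta=1.0):
    """Apply Delta w_ij = eta * y_i * x_j to every weight at once.
    np.outer(y, x) holds y_i * x_j in its (i, j) entry."""
    return W + eta * np.outer(y, x)
```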
Synaptic Weights in Hopfield Networks
For m patterns, add the individual weight contributions together:
$w_{ij} = \sum_{l=1}^{m} x_{l,i}\, x_{l,j} \quad (i \neq j), \qquad w_{ii} = 0$
In matrix form, it is the sum of the outer products of the patterns:
$W = \sum_{l=1}^{m} V_l V_l^T - m I$ (subtracting $mI$ avoids self-feedback),
where I is the n × n identity matrix and the superscript T denotes the matrix transpose.
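A sketch of this construction in NumPy, with the bipolar patterns supplied as the rows of an array:

```python
import numpy as np

def store_patterns(patterns):
    """Hopfield weight matrix W = sum_l V_l V_l^T - m * I
    for m bipolar (+1/-1) patterns given as the rows of `patterns`."""
    V = np.asarray(patterns, dtype=float)   # shape (m, n)
    m, n = V.shape
    return V.T @ V - m * np.eye(n)          # outer-product sum, zero diagonal
```

For the two patterns memorised later in these slides, store_patterns([[1, 1, 1], [-1, -1, -1]]) gives a matrix with 2s off the diagonal and 0s on it, matching the worked example.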
Associative Memory of Hopfield Networks
If $V_1, \dots, V_m$ are orthogonal, i.e. $V_i^T V_j = 0$ for $i \neq j$, then $V_l \to V_l$ for $l = 1, \dots, m$, provided $n > m$: in that case $W V_l = (n - m) V_l$, so each stored pattern reproduces itself.
If $V_1, \dots, V_m$ are not orthogonal, there is interference from the other patterns; for correct recall the interference should be weaker than $(n - m)$.
Storage Capacity
As the number of patterns m increases, the chance of accurate storage must decrease.
– Hopfield's empirical work in 1982: about half of the memories were stored accurately in a net of N nodes if m = 0.15N.
– McEliece's analysis in 1987: if we require almost all memories to be stored accurately, then the maximum number of patterns is m = N/(2 ln N). For N = 100, m = 11.
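A quick check of that last figure (the arithmetic only; not part of the original slides):

```python
import math

N = 100
m_max = N / (2 * math.log(N))   # N / (2 ln N)
print(round(m_max, 2))          # 10.86, i.e. about 11 patterns
```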
Limitations of the Hopfield Net
The number of patterns that can be stored and accurately recalled is severely limited:
– if too many patterns are stored, the net may converge to a novel, spurious pattern that matches no stored memory;
– an exemplar pattern will be unstable if it shares many bits in common with another exemplar pattern.
Example: Location Recall
In front of a door: ultrasonic sensors read $V_1 = [1, 1, 1]$
In the doorway: ultrasonic sensors read $V_2 = [-1, -1, -1]$
An Example of Memorisation
Memorise the two states (1, 1, 1) and (−1, −1, −1):
$Y_1 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \qquad Y_2 = \begin{bmatrix} -1 \\ -1 \\ -1 \end{bmatrix}$
The transposed forms of these vectors are $Y_1^T = [1 \; 1 \; 1]$ and $Y_2^T = [-1 \; -1 \; -1]$.
The 3 × 3 identity matrix is
$I = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$
Example (cont'd)
The weight matrix is determined as
$W = Y_1 Y_1^T + Y_2 Y_2^T - 2I$
so
$W = \begin{bmatrix} 0 & 2 & 2 \\ 2 & 0 & 2 \\ 2 & 2 & 0 \end{bmatrix}$
Next, the network is tested by the sequence of input vectors $X_1$ and $X_2$, which are equal to the output (or target) vectors $Y_1$ and $Y_2$, respectively.
Example (cont'd): the network is tested
First activate the network by applying the input vector X. Then calculate the actual output vector Y, and finally compare the result with the initial input vector X. Assume all thresholds to be zero for this example. Thus,
$Y_1 = \operatorname{sgn}(W X_1 - \theta) = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \qquad Y_2 = \operatorname{sgn}(W X_2 - \theta) = \begin{bmatrix} -1 \\ -1 \\ -1 \end{bmatrix}$
$Y_1 = X_1$ and $Y_2 = X_2$, so both states, (1, 1, 1) and (−1, −1, −1), are said to be stable.
Example (cont'd): other possible states

Possible state | Iter. | Inputs x1 x2 x3 | Outputs y1 y2 y3 | Fundamental memory
(1, 1, 1)      |   0   |  1  1  1        |  1  1  1         | (1, 1, 1)
(−1, 1, 1)     |   0   | −1  1  1        |  1  1  1         |
               |   1   |  1  1  1        |  1  1  1         | (1, 1, 1)
(1, −1, 1)     |   0   |  1 −1  1        |  1  1  1         |
               |   1   |  1  1  1        |  1  1  1         | (1, 1, 1)
(1, 1, −1)     |   0   |  1  1 −1        |  1  1  1         |
               |   1   |  1  1  1        |  1  1  1         | (1, 1, 1)
(−1, −1, −1)   |   0   | −1 −1 −1        | −1 −1 −1         | (−1, −1, −1)
(−1, −1, 1)    |   0   | −1 −1  1        | −1 −1 −1         |
               |   1   | −1 −1 −1        | −1 −1 −1         | (−1, −1, −1)
(−1, 1, −1)    |   0   | −1  1 −1        | −1 −1 −1         |
               |   1   | −1 −1 −1        | −1 −1 −1         | (−1, −1, −1)
(1, −1, −1)    |   0   |  1 −1 −1        | −1 −1 −1         |
               |   1   | −1 −1 −1        | −1 −1 −1         | (−1, −1, −1)
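A sketch that reproduces this table in NumPy. The tie-breaking rule (a zero net input leaves that neuron's state unchanged) is an assumption, chosen to be consistent with the outputs tabulated above:

```python
import numpy as np
from itertools import product

W = np.array([[0, 2, 2],
              [2, 0, 2],
              [2, 2, 0]], dtype=float)       # weight matrix from the example

def step(W, x):
    """One synchronous update; a zero net input leaves that bit unchanged."""
    net = W @ x
    return np.where(net == 0, x, np.sign(net))

for state in product([1, -1], repeat=3):
    x = np.array(state, dtype=float)
    for _ in range(10):                      # all 8 states settle within a step or two
        y = step(W, x)
        if np.array_equal(y, x):             # stable: stop iterating
            break
        x = y
    print(state, "->", x.astype(int))        # lands on (1,1,1) or (-1,-1,-1)
```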
Example (cont'd): errors relative to the fundamental memories
The fundamental memory (1, 1, 1) attracts the unstable states (−1, 1, 1), (1, −1, 1) and (1, 1, −1), each of which differs from it in a single bit.
The fundamental memory (−1, −1, −1) attracts the unstable states (−1, −1, 1), (−1, 1, −1) and (1, −1, −1).
Hopfield Network Training Algorithm
Step 1: Storage. The n-neuron Hopfield network is required to store a set of M fundamental memories, $Y_1, Y_2, \dots, Y_M$. The synaptic weight from neuron i to neuron j is calculated as
$w_{ij} = \begin{cases} \sum_{m=1}^{M} y_{m,i}\, y_{m,j}, & i \neq j \\ 0, & i = j \end{cases}$
where $y_{m,i}$ and $y_{m,j}$ are the ith and jth elements of the fundamental memory $Y_m$, respectively.
Hopfield Network Training Algorithm
In matrix form, the synaptic weights between neurons are represented as
$W = \sum_{m=1}^{M} Y_m Y_m^T - M I$
The Hopfield network can store a set of fundamental memories if the weight matrix is symmetrical ($w_{ij} = w_{ji}$) with zeros on its main diagonal. Once the weights are calculated, they remain fixed.
Hopfield Network Training Algorithm
Step 2: Testing. The network must recall any fundamental memory $Y_m$ when presented with it as an input:
$y_{m,i} = \operatorname{sgn}\left( \sum_{j=1}^{n} w_{ij}\, x_{m,j} - \theta_i \right)$
where $y_{m,i}$ is the ith element of the actual output vector $Y_m$, and $x_{m,j}$ is the jth element of the input vector $X_m$. In matrix form,
$Y_m = \operatorname{sgn}(W X_m - \theta), \qquad m = 1, \dots, M$
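A sketch of this testing step (thresholds default to zero, as in the worked example; it assumes the net inputs are nonzero, which holds there):

```python
import numpy as np

def recalls_all(W, memories, theta=0.0):
    """Check that Y_m = sgn(W X_m - theta) = X_m for every fundamental memory."""
    for x in np.asarray(memories, dtype=float):
        y = np.sign(W @ x - theta)        # sgn of the net inputs
        if not np.array_equal(y, x):      # each memory must reproduce itself
            return False
    return True
```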
Hopfield Network Training Algorithm
Step 3: Retrieval. (If all fundamental memories are recalled perfectly, proceed to this step.) Present an unknown n-dimensional vector (the probe) X to the network and retrieve a stable state. That is,
a) Initialise the retrieval algorithm of the Hopfield network by setting
$x_j(0) = x_j, \quad j = 1, \dots, n$
and calculate the initial state for each neuron:
$y_i(0) = \operatorname{sgn}\left( \sum_{j=1}^{n} w_{ij}\, x_j(0) - \theta_i \right), \quad i = 1, \dots, n$
Hopfield Network Training Algorithm
Step 3: Retrieval (cont'd)
where $x_j(0)$ is the jth element of the probe vector X at iteration p = 0, and $y_i(0)$ is the state of neuron i at iteration p = 0. In matrix form, the state vector at iteration p = 0 is
$Y(0) = \operatorname{sgn}(W X(0) - \theta)$
b) Update the elements of the state vector Y(p) according to the rule
$y_i(p+1) = \operatorname{sgn}\left( \sum_{j=1}^{n} w_{ij}\, y_j(p) - \theta_i \right)$
Hopfield Network Training Algorithm
Step 3: Retrieval (cont'd)
Neurons for updating are selected asynchronously, that is, randomly and one at a time. Repeat the iteration until the state vector becomes unchanged, or in other words, until a stable state is reached.
It can be proved that the Hopfield network will always converge to a stable state when the retrieval operation is performed asynchronously, provided $w_{ij} = w_{ji}$ and $w_{ii} = 0$.
A stable state (fixed point) satisfies $Y = \operatorname{sgn}(W Y - \theta)$.
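A sketch of asynchronous retrieval under these conditions. The tie-breaking choices (a zero net input keeps the current state; initial ties go to +1) and the scalar threshold are assumptions, not specified on the slides:

```python
import numpy as np

def retrieve_async(W, probe, theta=0.0, rng=None, max_sweeps=100):
    """Asynchronous retrieval: update one randomly chosen neuron at a time
    until a full sweep over all n neurons leaves the state unchanged."""
    rng = np.random.default_rng() if rng is None else rng
    y = np.sign(W @ np.asarray(probe, dtype=float) - theta)   # Y(0)
    y[y == 0] = 1                                # tie-break at initialisation
    n = len(y)
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(n):             # random order, one at a time
            new = np.sign(W[i] @ y - theta)
            if new != 0 and new != y[i]:         # zero net input: keep old state
                y[i] = new
                changed = True
        if not changed:                          # fixed point: Y = sgn(W Y - theta)
            return y
    return y
```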
Little Model of the Hopfield Network
The Little model uses synchronous dynamics for retrieval (Little and Shaw, 1975): all neurons are updated simultaneously,
$Y(p+1) = \operatorname{sgn}(W Y(p) - \theta)$
It can be proved that the Little model will always converge to a stable state or to a limit cycle of length at most 2, provided $w_{ij} = w_{ji}$.
It is very easy to implement using matrix manipulation, for example in Matlab.
Little Model: Example
[Example: one probe converges to a stable state; another alternates between two states, a limit cycle of length 2.]
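A sketch of synchronous (Little model) retrieval with limit-cycle detection (the matrices of the slide's original example are not reproduced here); it reports exactly the two outcomes the convergence result above guarantees:

```python
import numpy as np

def retrieve_sync(W, probe, theta=0.0, max_iters=100):
    """Little model: update all neurons at once, Y(p+1) = sgn(W Y(p) - theta).
    Returns one state (stable) or two states (a limit cycle of length 2)."""
    y_prev = np.asarray(probe, dtype=float)
    y = np.sign(W @ y_prev - theta)
    for _ in range(max_iters):
        y_next = np.sign(W @ y - theta)
        if np.array_equal(y_next, y):          # stable state reached
            return [y]
        if np.array_equal(y_next, y_prev):     # oscillation: length-2 cycle
            return [y_prev, y]
        y_prev, y = y, y_next
    return [y]
```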
Hopfield Network as a Model for Associative Memory
Associative memory:
– associates different features with each other, e.g. Karen → green, George → red, Paul → blue;
– recalls from partial cues.
Neural Network Model of Associative Memory
Neurons are arranged like a grid:
[Diagram: a grid of neurons, one per element of the stored pattern.]
Setting the Weights
Each pattern can be denoted by a vector of −1s and 1s:
$x^{(l)} = (x_1^{(l)}, x_2^{(l)}, \dots, x_n^{(l)}), \quad x_i^{(l)} \in \{-1, +1\}$
If the number of patterns is m, then
$w_{ij} = \sum_{l=1}^{m} x_i^{(l)} x_j^{(l)}, \quad i \neq j, \qquad w_{ii} = 0$
Hebbian learning: the neurons that fire together, wire together (see the sketch below).
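A minimal end-to-end sketch of these ideas, assuming a toy encoding (the bit layout for the name–colour pairs of the earlier slide is invented for illustration): store each pair as one joint pattern, then recall the full pattern from the name alone.

```python
import numpy as np

# Invented toy encoding: first 3 bits = name, last 3 bits = colour (+1/-1).
patterns = np.array([
    [ 1, -1, -1,  1, -1, -1],   # Karen  + green
    [-1,  1, -1, -1,  1, -1],   # George + red
    [-1, -1,  1, -1, -1,  1],   # Paul   + blue
], dtype=float)

m, n = patterns.shape
W = patterns.T @ patterns - m * np.eye(n)        # Hebbian weights, zero diagonal

# Partial cue: Karen's name bits only; the colour bits are unknown (-1 here).
y = np.array([1, -1, -1, -1, -1, -1], dtype=float)
for _ in range(10):                              # synchronous updates until stable
    net = W @ y
    y_new = np.where(net == 0, y, np.sign(net))  # zero net input: keep old bit
    if np.array_equal(y_new, y):
        break
    y = y_new
print(y.astype(int))   # [ 1 -1 -1  1 -1 -1]: Karen + green recalled
```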
Learning in the Hopfield Net
[Diagram.]
Summary
– Associative memory
– Discrete Hopfield neural networks
– Hebbian learning rule
Readings:
– Picton's book
– Haykin's book, pp. 289-308
– Blackboard readings