IE 585 Associative Network
Associative Memory NN
A single-layer net in which the weights are determined so that the net can store a set of pattern associations.
The net not only learns the specific pattern pairs used for training, but is also able to recall the desired response pattern when given an input stimulus that is similar, but not identical, to a training input.
Types of Associative Memory
Autoassociative memory: each target (output) vector t is the same as the input vector s with which it is associated.
Heteroassociative memory: the t's are different from the s's.
Discrete Hopfield Net
Developed by Hopfield (binary version 1982, bipolar version 1984)
Recurrent, autoassociative, fully interconnected neural network
Symmetric weights with no self-connections
Asynchronous updating: only one unit updates its activation at a time; each unit continues to receive an external signal in addition to the signals from the other units in the net
Architecture of the Hopfield Net
[Figure: a single, fully interconnected layer of units Y1, ..., Yn, each also receiving an external input x1, ..., xn]
Procedure of the Discrete Hopfield Net
Initialize weights to store patterns (Hebb rule, next slide)
For each input vector x:
  set y_i = x_i for each unit Y_i
  for each unit Y_i, chosen in random order, compute the net input y_in_i = x_i + Σ_j y_j w_ji
  apply the transfer function, y_i = f(y_in_i), and broadcast y_i to all other units
Continue until the activations converge
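A minimal Python sketch of this recall loop, assuming a weight matrix W has already been stored (for example by the Hebb rule on the next slide) and a bipolar step activation with threshold theta; the function name, the convergence test (a full sweep with no change), and the iteration cap are illustrative choices, not part of the slides.

```python
import numpy as np

def hopfield_recall(W, x, theta=0.0, max_iters=100, rng=None):
    """Asynchronous recall for a discrete (bipolar) Hopfield net.

    W     : (n, n) symmetric weight matrix with zero diagonal
    x     : external input vector with entries +1/-1
    theta : threshold (usually 0, as noted on a later slide)
    """
    rng = np.random.default_rng() if rng is None else rng
    y = np.array(x, dtype=float)              # initialize activations: y_i = x_i
    for _ in range(max_iters):
        changed = False
        for i in rng.permutation(len(y)):     # update units in random order
            y_in = x[i] + W[:, i] @ y         # external signal + signals from the other units
            new_y = y[i]                      # activation is unchanged when y_in == theta
            if y_in > theta:
                new_y = 1.0
            elif y_in < theta:
                new_y = -1.0
            if new_y != y[i]:
                y[i] = new_y                  # "broadcast" the new activation
                changed = True
        if not changed:                       # converged: a full sweep with no change
            break
    return y
```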
Hebb Learning
Weights are set by the Hebb rule from the stored patterns s(p), p = 1, ..., P, with w_ii = 0:
Binary patterns: w_ij = Σ_p [2 s_i(p) - 1] [2 s_j(p) - 1] for i ≠ j
Bipolar patterns: w_ij = Σ_p s_i(p) s_j(p) for i ≠ j
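A short sketch of the bipolar form of this rule, under the same assumptions as the recall sketch above (the function name is illustrative):

```python
import numpy as np

def hebb_weights(patterns):
    """Hebb-rule weight matrix for a discrete Hopfield net.

    patterns : sequence of bipolar vectors s(p) of length n (entries +1/-1)
    Returns an (n, n) symmetric matrix with w_ij = sum_p s_i(p) s_j(p) and w_ii = 0.
    """
    P = np.array(list(patterns), dtype=float)   # shape (number of patterns, n)
    W = P.T @ P                                 # sum of outer products s(p) s(p)^T
    np.fill_diagonal(W, 0.0)                    # no self-connections
    return W
```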
Transfer Function
Step function with threshold θ; the activation is left unchanged when the net input equals the threshold.
Binary: f(y_in) = 1 if y_in > θ, 0 if y_in < θ
Bipolar: f(y_in) = 1 if y_in > θ, -1 if y_in < θ
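A direct transcription of these two cases (the function name and the bipolar flag are illustrative):

```python
def step_activation(y_in, y_prev, theta=0.0, bipolar=True):
    """Step transfer function for the discrete Hopfield net.

    Returns 1 above the threshold, the low value (0 binary, -1 bipolar) below it,
    and leaves the activation unchanged exactly at the threshold.
    """
    if y_in > theta:
        return 1
    if y_in < theta:
        return -1 if bipolar else 0
    return y_prev   # y_in == theta: activation stays the same
```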
θ is usually 0
The units are updated in random order
The order of the training patterns does not affect the stored weights
Extended to continuous activations, for both pattern association and constrained optimization, by Hopfield & Tank (1985, the Hopfield-Tank net)
Hopfield Net Example
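As an illustration (not the slide's own example), the two sketches above can be combined to store one bipolar pattern and recall it from a corrupted probe:

```python
import numpy as np

# Illustrative only: store one bipolar pattern with the Hebb-rule sketch above,
# then recall it from a probe with one flipped bit.
stored = np.array([1, 1, 1, -1, -1, -1])
W = hebb_weights([stored])

probe = stored.copy()
probe[0] = -1                              # corrupt one component
recalled = hopfield_recall(W, probe)       # asynchronous recall sketch above
print(np.array_equal(recalled, stored))    # expected: True
```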
Energy Function for the Discrete Hopfield Net
Also called a Lyapunov function, after Aleksandr Lyapunov (Russian mathematician)
Asynchronous updating of the Hopfield net allows such a function to be defined
Used to prove that the net converges to a stable limit point (a stable pattern of activations)
A non-increasing function of the state of the system
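A standard form of this energy function, written in the notation of the earlier slides:

```latex
% Energy (Lyapunov) function for the discrete Hopfield net, using the
% notation of the earlier slides: activations y_i, external inputs x_i,
% symmetric weights w_ij, thresholds \theta_i.
E = -\frac{1}{2}\sum_{i}\sum_{j \ne i} y_i \, y_j \, w_{ij}
    - \sum_{i} x_i \, y_i
    + \sum_{i} \theta_i \, y_i
% Each asynchronous update gives \Delta E \le 0, and E is bounded below,
% so the net settles into a stable activation pattern.
```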
Storage Capacity of the Hopfield Net
Separate capacity estimates apply to binary and to bipolar patterns
P: # of patterns that can be stored and recalled in a net with reasonable accuracy
n: # of neurons in the net
Spurious Memory
A stable energy state whose activation vector is not one of the stored patterns
A discrete Hopfield net can be used to determine whether an input vector is a "known" vector or an "unknown" vector
Hamming Distance (HD)
The # of bits that differ between two binary or bipolar vectors
Orthogonal vectors: HD = n/2, where n is the # of bits
Orthogonal patterns allow the maximum # of patterns to be stored
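A small check of this relationship (the vectors are illustrative):

```python
import numpy as np

def hamming_distance(a, b):
    """Number of positions at which two binary or bipolar vectors differ."""
    return int(np.sum(np.asarray(a) != np.asarray(b)))

# Two orthogonal bipolar vectors (dot product 0) differ in exactly n/2 positions.
u = np.array([1,  1, -1, -1])
v = np.array([1, -1,  1, -1])
print(u @ v)                    # 0 -> orthogonal
print(hamming_distance(u, v))   # 2 = n/2 for n = 4
```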
Bi-directional Associative Memory (BAM)
Developed by Kosko (1988)
Recurrent, heteroassociative
Two layers of neurons connected by bidirectional weighted connection paths
Symmetric weights (the same weights serve both directions) with no self-connections
Signals are sent from only one layer to the other at any step of the process, not simultaneously in both directions
Architecture of the BAM
[Figure: X-layer units X1, ..., Xn and Y-layer units Y1, ..., Ym, with a weighted connection between every X unit and every Y unit]
Procedure of the BAM
Initialize weights to store patterns (Hebb rule, next slide)
Initialize all activations to 0
For each testing input vector:
  present the input x to the X-layer (or y to the Y-layer)
  compute the net inputs to the Y units, y_in_j = Σ_i x_i w_ij, and their activations y_j = f(y_in_j)
  send signals back to the X-layer and compute x_in_i = Σ_j w_ij y_j, x_i = f(x_in_i)
Continue alternating between the layers until the activations converge
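A minimal Python sketch of this bidirectional recall loop, assuming a weight matrix W (n x m) stored by the Hebb rule on the next slide and a bipolar step activation; the function name and the convergence test are illustrative.

```python
import numpy as np

def bam_recall(W, x, max_iters=100):
    """Bidirectional recall for a bipolar BAM.

    W : (n, m) weight matrix (X-layer to Y-layer; W.T is used going back)
    x : bipolar input vector presented to the X-layer
    """
    def step(net, prev):
        # Bipolar step function; the activation is unchanged when the net input is 0.
        return np.where(net > 0, 1.0, np.where(net < 0, -1.0, prev))

    x = np.array(x, dtype=float)
    y = np.zeros(W.shape[1])                  # Y-layer activations start at 0
    for _ in range(max_iters):
        y_new = step(x @ W, y)                # X-layer -> Y-layer
        x_new = step(y_new @ W.T, x)          # Y-layer -> X-layer
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break                             # activations stopped changing
        x, y = x_new, y_new
    return x, y
```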
Hebb Learning
Weights are set from the training pairs s(p) : t(p), p = 1, ..., P:
Binary patterns: w_ij = Σ_p [2 s_i(p) - 1] [2 t_j(p) - 1]
Bipolar patterns: w_ij = Σ_p s_i(p) t_j(p)
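A sketch of the bipolar form, matching the recall sketch above (the function name is illustrative):

```python
import numpy as np

def bam_hebb_weights(pairs):
    """Hebb-rule weight matrix for a bipolar BAM.

    pairs : sequence of (s, t) bipolar training pairs, s of length n, t of length m
    Returns an (n, m) matrix with w_ij = sum_p s_i(p) t_j(p).
    """
    S = np.array([s for s, _ in pairs], dtype=float)   # shape (P, n)
    T = np.array([t for _, t in pairs], dtype=float)   # shape (P, m)
    return S.T @ T                                     # sum of outer products s(p) t(p)^T
```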
Transfer Function
Step function with threshold θ, as for the Hopfield net; the activation is left unchanged when the net input equals the threshold.
Binary: f(y_in) = 1 if y_in > θ, 0 if y_in < θ
Bipolar: f(y_in) = 1 if y_in > θ, -1 if y_in < θ
BAM Example
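As an illustration (not the slide's own example), the two BAM sketches above can be used to store two pattern pairs and recall a target from a noisy input:

```python
import numpy as np

# Illustrative only: store two bipolar pairs with the Hebb-rule sketch above,
# then recall the first target from an input with one flipped bit.
pairs = [
    (np.array([ 1,  1,  1, -1, -1, -1]), np.array([ 1, -1])),
    (np.array([-1, -1,  1,  1, -1,  1]), np.array([-1,  1])),
]
W = bam_hebb_weights(pairs)

probe = pairs[0][0].copy()
probe[0] = -1                       # corrupt one component of the first input
x, y = bam_recall(W, probe)         # bidirectional recall sketch above
print(y)                            # expected: [ 1. -1.]
```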
Storage Capacity of the BAM
P: # of patterns that can be stored and recalled in a net with reasonable accuracy; the commonly quoted estimate is P ≤ min(n, m)
n: # of input units
m: # of output units