Feedback Networks and Associative Memories
虞台文, Graduate Institute of Computer Science and Engineering, Tatung University, Intelligent Multimedia Lab
Contents: Introduction; Discrete Hopfield NNs; Continuous Hopfield NNs; Associative Memories; Hopfield Memory; Bidirectional Associative Memory
Feedback Networks and Associative Memories: Introduction
Feedforward/Feedback NNs
Feedforward NNs:
  The connections between units do not form cycles.
  Usually produce a response to an input quickly.
  Most feedforward NNs can be trained using a wide variety of efficient algorithms.
Feedback (recurrent) NNs:
  There are cycles in the connections.
  In some feedback NNs, each time an input is presented, the NN must iterate for a potentially long time before it produces a response.
  Usually more difficult to train than feedforward NNs.
Supervised-Learning NNs
Feedforward NNs:
  Perceptron; Adaline, Madaline; Backpropagation (BP); Artmap; Learning Vector Quantization (LVQ); Probabilistic Neural Network (PNN); General Regression Neural Network (GRNN)
Feedback (recurrent) NNs:
  Brain-State-in-a-Box (BSB); Fuzzy Cognitive Map (FCM); Boltzmann Machine (BM); Backpropagation Through Time (BPTT)
Unsupervised-Learning NNs
Feedforward NNs:
  Learning Matrix (LM); Sparse Distributed Associative Memory (SDM); Fuzzy Associative Memory (FAM); Counterpropagation (CPN)
Feedback (recurrent) NNs:
  Binary Adaptive Resonance Theory (ART1); Analog Adaptive Resonance Theory (ART2, ART2a); Discrete Hopfield (DH); Continuous Hopfield (CH); Discrete Bidirectional Associative Memory (BAM); Kohonen Self-Organizing Map / Topology-Preserving Map (SOM/TPM)
The Hopfield NNs
In 1982, Hopfield, a Caltech physicist, mathematically tied together many of the ideas from previous research: a fully connected, symmetrically weighted network in which each node functions both as an input and an output node.
Used for: associative memories; combinatorial optimization.
Associative Memories
An associative memory is a content-addressable structure that maps a set of input patterns to a set of output patterns. There are two types: autoassociative and heteroassociative.
Auto-association: retrieves a previously stored pattern that most closely resembles the current input.
Hetero-association: the retrieved pattern is, in general, different from the input pattern, not only in content but possibly also in type and format.
Associative Memories
[Figure: auto-association maps a degraded "A" back to the stored "A"; hetero-association maps the word "Niagara" to a stored image of a waterfall.]
Optimization Problems
Costs are associated with the energy function of a Hopfield network, so they must be expressible in quadratic form. The network settles into locally satisfactory solutions rather than choosing from a set of candidates: it finds local optima, not the global optimum.
Feedback Networks and Associative Memories: Discrete Hopfield NNs
The Discrete Hopfield NNs
[Figure: n fully interconnected units; unit i receives external input I_i and produces output v_i through weights w_ij.]
The weights are symmetric with no self-connections: w_ij = w_ji, w_ii = 0.
State Update Rule
Asynchronous mode: at each time step, a single neuron is selected for update.
Update rule: H_i(t) = sum_j w_ij v_j(t) + I_i;  v_i(t+1) = +1 if H_i(t) >= 0, and -1 otherwise.
Is the network stable?
Energy Function
E = -(1/2) sum_i sum_j w_ij v_i v_j - sum_i I_i v_i
Fact: E is lower bounded (upper bounded). If E is monotonically decreasing (increasing), the system is stable.
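As a concrete check of this formula, the sketch below evaluates E for a tiny network. The weights and state are made-up illustrative values, assuming bipolar states v_i in {-1, +1}:

```python
import numpy as np

def energy(W, v, I):
    # E = -1/2 * v^T W v - I^T v
    return -0.5 * v @ W @ v - I @ v

# Toy symmetric weights with zero diagonal (w_ij = w_ji, w_ii = 0).
W = np.array([[ 0.,  1., -1.],
              [ 1.,  0.,  1.],
              [-1.,  1.,  0.]])
I = np.zeros(3)
v = np.array([1., 1., -1.])
E = energy(W, v, I)   # -> -1.0
```

With I = 0, note that E is unchanged if every state is negated, which is why complements of stored patterns are also stable states.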
The Proof
Suppose that at time t + 1 the kth neuron is selected for update. All other neurons keep their values: v_j(t+1) = v_j(t) for j != k.
Then, since w_kk = 0 and the weights are symmetric,
dE = E(t+1) - E(t) = -[v_k(t+1) - v_k(t)] H_k(t),  where H_k(t) = sum_j w_kj v_j(t) + I_k.
The Proof
v_k(t) -> v_k(t+1) | H_k(t)      | dE
-1 -> -1           | H_k(t) < 0  | 0
-1 -> +1           | H_k(t) >= 0 | <= 0
+1 -> -1           | H_k(t) < 0  | < 0
+1 -> +1           | H_k(t) >= 0 | 0
In every case dE <= 0; since E is bounded below, the network settles into a stable state.
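The table's conclusion can be observed numerically: under asynchronous updates, E never increases. A minimal sketch with random (made-up) symmetric weights and external inputs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
A = rng.standard_normal((n, n))
W = (A + A.T) / 2          # symmetric weights, w_ij = w_ji
np.fill_diagonal(W, 0.0)   # no self-connections, w_ii = 0
I = rng.standard_normal(n)

def energy(v):
    return -0.5 * v @ W @ v - I @ v

v = rng.choice([-1.0, 1.0], size=n)
energies = [energy(v)]
for _ in range(200):       # asynchronous mode: one random neuron per step
    k = rng.integers(n)
    H = W[k] @ v + I[k]
    v[k] = 1.0 if H >= 0 else -1.0
    energies.append(energy(v))
```

Because the state space is finite and E is non-increasing, the trajectory must eventually stop changing.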
Feedback Networks and Associative Memories: Continuous Hopfield NNs
The Neuron of Continuous Hopfield NNs
[Figure: neuron i sums the weighted inputs w_i1 v_1, ..., w_in v_n and the external current I_i on a capacitance C_i with leak conductance g_i. Its internal state is u_i; its output is v_i = a(u_i), where a is a monotonically increasing sigmoid with inverse u_i = a^{-1}(v_i).]
The Dynamics
C_i du_i/dt = sum_j w_ij v_j - G_i u_i + I_i,  where G_i = g_i + sum_j w_ij.
The Continuous Hopfield NNs
[Figure: n such neurons, fully interconnected through symmetric weights w_ij.]
Is the network stable?
Equilibrium Points
Consider the autonomous system dy/dt = f(y(t)).
Equilibrium points y* satisfy f(y*) = 0.
Lyapunov Theorem
Call E(y) the energy function. The system is asymptotically stable if the following holds: there exists a positive-definite function E(y) such that dE(y)/dt < 0 along every trajectory (except at the equilibrium point).
Lyapunov Energy Function
E = -(1/2) sum_i sum_j w_ij v_i v_j + sum_i G_i ∫_0^{v_i} a^{-1}(v) dv - sum_i I_i v_i
[Figure: the network together with the sigmoid v = a(u) and its inverse u = a^{-1}(v).]
Stability of Continuous Hopfield NNs
Along the dynamics,
dE/dt = sum_i (∂E/∂v_i)(dv_i/dt) = -sum_i C_i (du_i/dt)(dv_i/dt) = -sum_i C_i [a^{-1}]'(v_i) (dv_i/dt)^2 <= 0,
since a is monotonically increasing, so [a^{-1}]'(v) > 0.
Stability of Continuous Hopfield NNs
[Figure: the continuous Hopfield network.]
Since E is bounded below and dE/dt <= 0, the continuous Hopfield network is stable.
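The inequality dE/dt <= 0 can also be checked numerically. The sketch below integrates a small continuous Hopfield network with forward Euler, using a(u) = tanh(u), C_i = 1, and a single leak term G_i = 1; all parameter values are illustrative, and the integral of a^{-1} has the closed form shown in the comment:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n))
W = (A + A.T) / 2
np.fill_diagonal(W, 0.0)
I = rng.standard_normal(n)
G, C, dt = 1.0, 1.0, 1e-3       # leak, capacitance, Euler step

def energy(v):
    # integral of a^{-1}(v) = arctanh(v) is v*arctanh(v) + 0.5*ln(1 - v^2)
    integ = v * np.arctanh(v) + 0.5 * np.log1p(-v**2)
    return -0.5 * v @ W @ v + G * integ.sum() - I @ v

u = 0.1 * rng.standard_normal(n)
energies = []
for _ in range(5000):
    v = np.tanh(u)
    energies.append(energy(v))
    u = u + (dt / C) * (W @ v - G * u + I)   # C du/dt = Wv - Gu + I
```

With a small enough step size the recorded energies decrease monotonically, mirroring the Lyapunov argument above.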
Feedback Networks and Associative Memories: Associative Memories
Associative Memories
Also called content-addressable memory.
Autoassociative memory: Hopfield memory.
Heteroassociative memory: Bidirectional Associative Memory (BAM).
Associative Memories
Stored patterns are pairs (x_k, y_k), k = 1, ..., p.
Autoassociative: y_k = x_k. Heteroassociative: y_k differs from x_k.
Feedback Networks and Associative Memories: Hopfield Memory / Bidirectional Associative Memory
Hopfield Memory
A 12 x 10 image grid gives 120 neurons; fully connected, 14,400 weights.
Example
[Figure: stored bit-map patterns; presenting a noisy input to the memory retrieves the closest stored pattern.]
Example
How do we store the patterns?
The Storage Algorithm
Suppose the set of stored patterns is {x_1, ..., x_p}, each of dimension n with components in {-1, +1}.
W = sum_{k=1}^{p} x_k x_k^T - pI
Equivalently, w_ij = sum_k x_i^k x_j^k for i != j, and w_ii = 0 (the -pI term cancels the diagonal).
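A minimal sketch of this outer-product (Hebbian) storage rule together with asynchronous recall. The two stored patterns are made up and chosen mutually orthogonal so that the crosstalk term vanishes:

```python
import numpy as np

def store(patterns):
    # W = sum_k x_k x_k^T - p*I  (i.e. zero the diagonal)
    X = np.asarray(patterns, dtype=float)
    W = X.T @ X
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, sweeps=5):
    v = np.asarray(x, dtype=float).copy()
    for _ in range(sweeps):           # asynchronous sweeps over all neurons
        for k in range(len(v)):
            v[k] = 1.0 if W[k] @ v >= 0 else -1.0
    return v

patterns = [[1, -1, 1, -1, 1, -1, 1, -1],
            [1, 1, 1, 1, -1, -1, -1, -1]]   # orthogonal pair
W = store(patterns)

noisy = np.array(patterns[0], dtype=float)
noisy[0] = -noisy[0]                  # corrupt one bit
recovered = recall(W, noisy)          # converges back to patterns[0]
```

Each stored pattern is an exact fixed point here, and a one-bit corruption falls inside its basin of attraction.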
Analysis
Suppose that the input x equals a stored pattern x_i. Then
W x_i = sum_k x_k x_k^T x_i - p x_i = (n - p) x_i + sum_{k != i} (x_k^T x_i) x_k.
If the stored patterns are mutually orthogonal, the crosstalk term vanishes and x_i is a fixed point; otherwise x_i is still recalled correctly as long as the crosstalk is small.
Example
Example
[Figure: a four-neuron network and its state-transition diagram; states with energy E = 4 transition toward the stable state with E = 0.]
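For a network this small, every state can be enumerated and tested for stability, which also exposes the complement-memory phenomenon. The weights below are hypothetical, since the slide's own figure is not recoverable:

```python
import numpy as np
from itertools import product

# Hypothetical 4-neuron symmetric weights with zero diagonal.
W = np.array([[ 0.,  1., -1.,  1.],
              [ 1.,  0.,  1., -1.],
              [-1.,  1.,  0.,  1.],
              [ 1., -1.,  1.,  0.]])

def energy(v):
    return -0.5 * v @ W @ v

def is_stable(v):
    # stable <=> no single asynchronous update flips any neuron
    return all((1.0 if W[k] @ v >= 0 else -1.0) == v[k] for k in range(4))

states = [np.array(s, dtype=float) for s in product([-1., 1.], repeat=4)]
stable = [s for s in states if is_stable(s)]
```

With zero external input the energy is an even function of the state, so stable states come in complementary pairs.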
Problems of the Hopfield Memory
Complement memories: if x is stored, its complement -x is also a stable state.
Spurious stable states.
Limited capacity.
Capacity of Hopfield Memory
The number of storable patterns w.r.t. the size of the network.
Study methods: experiments; probability; information theory; radius of attraction.
Capacity of Hopfield Memory
The number of storable patterns w.r.t. the size of the network. Hopfield (1982) demonstrated that the maximum number of patterns that can be stored in a Hopfield model of n nodes, before the error in the retrieved pattern becomes severe, is around 0.15n. The memory capacity of the Hopfield model can be increased, as shown by Andrecut.
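The 0.15n figure can be probed with a quick Monte Carlo experiment: store p random patterns and count how many remain exact fixed points of the update rule. The sizes below are arbitrary illustrative choices:

```python
import numpy as np

def fraction_stable(n, p, seed=0):
    # Store p random bipolar patterns in an n-neuron net and return
    # the fraction that are exact fixed points of the update rule.
    rng = np.random.default_rng(seed)
    X = rng.choice([-1.0, 1.0], size=(p, n))
    W = X.T @ X
    np.fill_diagonal(W, 0.0)
    ok = sum(np.array_equal(np.sign(W @ x), x) for x in X)
    return ok / p

few = fraction_stable(n=100, p=5)     # well under 0.15n: reliable recall
many = fraction_stable(n=100, p=30)   # well over 0.15n: recall degrades
```

Below the capacity limit essentially every stored pattern is stable; above it, most stored patterns are no longer fixed points.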
Radius of attraction (0 < ρ <= 1/2); false (spurious) memories.
Feedback Networks and Associative Memories: Hopfield Memory / Bidirectional Associative Memory
Bidirectional Memory
[Figure: two layers connected by the weight matrix W = [w_ij]_{n x m}: an X layer with m units x_1, ..., x_m and a Y layer with n units y_1, ..., y_n. Forward pass: y = sgn(Wx). Backward pass: x = sgn(W^T y). Given a set of stored pattern pairs, the two passes alternate until the pair (x, y) stabilizes.]
The Storage Algorithm
Given the stored pattern pairs (x_k, y_k), k = 1, ..., p:
W = sum_{k=1}^{p} y_k x_k^T
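A minimal sketch of this correlation storage rule with alternating forward and backward passes; the stored pairs are made up and chosen with mutually orthogonal x's and y's so recall is exact:

```python
import numpy as np

def bam_store(X, Y):
    # Correlation storage: W = sum_k y_k x_k^T  (an n-by-m matrix)
    return sum(np.outer(y, x) for x, y in zip(X, Y))

def bam_recall(W, x, iters=5):
    x = np.asarray(x, dtype=float)
    y = np.sign(W @ x)                 # forward pass
    for _ in range(iters):
        x = np.sign(W.T @ y)           # backward pass
        y = np.sign(W @ x)             # forward pass
    return x, y

X = np.array([[1, -1, 1, -1, 1, -1],
              [1, 1, -1, -1, 1, 1]], dtype=float)   # m = 6
Y = np.array([[1, 1, -1, -1],
              [1, -1, 1, -1]], dtype=float)         # n = 4
W = bam_store(X, Y)
x_out, y_out = bam_recall(W, X[0])     # settles on the pair (x_1, y_1)
```

Each stored pair is a resonance of the two passes, so presenting x_1 immediately retrieves y_1 and the pair no longer changes.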
Analysis
Suppose x_{k'} is one of the stored vectors. Then
W x_{k'} = sum_k y_k (x_k^T x_{k'}) = m y_{k'} + sum_{k != k'} (x_k^T x_{k'}) y_k.
If the x_k are mutually orthogonal the crosstalk term is 0, so the forward pass retrieves y_{k'} exactly.
Bidirectional Memory: Energy Function
E(x, y) = -y^T W x
Each forward or backward pass cannot increase E, and E is bounded below, so the bidirectional memory converges to a stable pair (a resonance).
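The monotonicity claim can be verified directly: each sgn pass minimizes its layer's contribution to E, so the recorded energies never increase. A sketch with arbitrary made-up weights and a random starting pair:

```python
import numpy as np

def bam_energy(W, x, y):
    # E(x, y) = -y^T W x
    return -y @ W @ x

rng = np.random.default_rng(2)
W = rng.integers(-2, 3, size=(5, 7)).astype(float)  # arbitrary weights
x = rng.choice([-1.0, 1.0], size=7)
y = rng.choice([-1.0, 1.0], size=5)

energies = [bam_energy(W, x, y)]
for _ in range(10):
    y = np.sign(W @ x)                 # forward pass
    energies.append(bam_energy(W, x, y))
    x = np.sign(W.T @ y)               # backward pass
    energies.append(bam_energy(W, x, y))
```

Since the state space is finite and E never increases, the alternation must reach a pair that neither pass changes.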