
1 Feedback Networks and Associative Memories
虞台文 (Tai-Wen Yu), Institute of Computer Science and Engineering, Tatung University, Intelligent Multimedia Research Lab

2 Content
Introduction
Discrete Hopfield NNs
Continuous Hopfield NNs
Associative Memories: Hopfield Memory, Bidirectional Memory

3 Feedback Networks and Associative Memories
Introduction

4 Feedforward/Feedback NNs
Feedforward NNs: the connections between units do not form cycles. They usually produce a response to an input quickly, and most can be trained with a wide variety of efficient algorithms.
Feedback (recurrent) NNs: there are cycles in the connections. In some feedback NNs, each time an input is presented, the network must iterate for a potentially long time before it produces a response. They are usually more difficult to train than feedforward NNs.

5 Supervised-Learning NNs
Feedforward NNs: Perceptron; Adaline, Madaline; Backpropagation (BP); Artmap; Learning Vector Quantization (LVQ); Probabilistic Neural Network (PNN); General Regression Neural Network (GRNN).
Feedback or recurrent NNs: Brain-State-in-a-Box (BSB); Fuzzy Cognitive Map (FCM); Boltzmann Machine (BM); Backpropagation Through Time (BPTT).

6 Unsupervised-Learning NNs
Feedforward NNs: Learning Matrix (LM); Sparse Distributed Associative Memory (SDM); Fuzzy Associative Memory (FAM); Counterpropagation (CPN).
Feedback or recurrent NNs: Binary Adaptive Resonance Theory (ART1); Analog Adaptive Resonance Theory (ART2, ART2a); Discrete Hopfield (DH); Continuous Hopfield (CH); Discrete Bidirectional Associative Memory (BAM); Kohonen Self-Organizing Map / Topology-Preserving Map (SOM/TPM).

7 The Hopfield NNs In 1982, Hopfield, a Caltech physicist, mathematically tied together many ideas from previous research: a fully connected, symmetrically weighted network in which each node functions as both an input and an output node. It is used for associative memories and combinatorial optimization.

8 Associative Memories An associative memory is a content-addressable structure that maps a set of input patterns to a set of output patterns. There are two types: autoassociative and heteroassociative. Auto-association retrieves the previously stored pattern that most closely resembles the current input. In hetero-association, the retrieved pattern is, in general, different from the input pattern, not only in content but possibly also in type and format.

9 Associative Memories
[Figure: auto-association recalls the stored pattern itself, e.g., a degraded "A" maps back to the stored "A"; hetero-association recalls a different pattern, e.g., the input "Niagara" maps to the stored output "Waterfall".]

10 Optimization Problems
Costs are associated with the energy function of a Hopfield network, so the cost must be expressible in quadratic form. The network finds local, satisfactory solutions by settling into an energy minimum; it does not choose solutions from a set. It reaches local optima, not global ones.
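
For instance (an illustration of the quadratic-form requirement, not an example from the slides): a cost of the form $\mathrm{cost}(\mathbf{v}) = \sum_{i<j} a_{ij} v_i v_j + \sum_i b_i v_i$ matches the Hopfield energy $E = -\frac{1}{2}\sum_i\sum_j w_{ij} v_i v_j - \sum_i I_i v_i$ when we set $w_{ij} = -a_{ij}$ and $I_i = -b_i$; letting the network settle then locally minimizes the cost.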

11 Feedback Networks and Associative Memories
Discrete Hopfield NNs

12 The Discrete Hopfield NNs
[Figure: a fully connected network of n neurons; neuron i receives an external input Ii, produces output vi, and is connected to every other neuron j through weight wij.]

13 The Discrete Hopfield NNs
$w_{ij} = w_{ji}$, $w_{ii} = 0$: the weights are symmetric and there are no self-connections. [Figure: the same fully connected network as above.]

15 State Update Rule Asynchronous mode: only one neuron is selected for update at a time. Update rule: the selected neuron k computes its net input $H_k(t) = \sum_{j=1}^{n} w_{kj} v_j(t) + I_k$ and sets $v_k(t+1) = 1$ if $H_k(t) \ge 0$, and $v_k(t+1) = -1$ otherwise. Is the network stable under this rule?
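
A minimal sketch of this asynchronous update in Python/NumPy (the weights, biases, and sizes below are illustrative assumptions, not values from the slides):

```python
import numpy as np

def async_update(W, I, v, rng):
    """One asynchronous sweep: neurons update one at a time, in random order."""
    for k in rng.permutation(len(v)):
        h = W[k] @ v + I[k]             # net input H_k = sum_j w_kj v_j + I_k
        v[k] = 1.0 if h >= 0 else -1.0  # bipolar threshold
    return v

# Tiny illustrative network: symmetric weights, zero diagonal (assumed values).
rng = np.random.default_rng(0)
W = np.array([[0., 1., -1.],
              [1., 0., 1.],
              [-1., 1., 0.]])
I = np.zeros(3)
v = np.array([1., -1., -1.])
for _ in range(5):                      # a few sweeps; the state stops changing
    v = async_update(W, I, v, rng)
print(v)
```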

16 Energy Function $E = -\frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n} w_{ij} v_i v_j - \sum_{i=1}^{n} I_i v_i$. Fact: E is lower bounded. If E is monotonically non-increasing under the update rule, the system is stable.

17 The Proof Suppose that at time t + 1, the kth neuron is selected for update.

The Proof Suppose that at time t + 1, the kth neuron is selected for update; the values of all other neurons are not changed at time t + 1.

19 The Proof Since only neuron k changes, $E(t+1) - E(t) = -\big(v_k(t+1) - v_k(t)\big)\Big(\sum_{j} w_{kj} v_j(t) + I_k\Big) = -\big(v_k(t+1) - v_k(t)\big) H_k(t)$, using $w_{kk} = 0$ and $w_{ij} = w_{ji}$.

20 The Proof
vk(t)   Hk(t)   vk(t+1)   ΔE = E(t+1) − E(t)
−1      ≥ 0     1         ≤ 0
−1      < 0     −1        = 0
1       ≥ 0     1         = 0
1       < 0     −1        < 0
In every case ΔE ≤ 0, so E is monotonically non-increasing; since E is lower bounded, the network must settle into a stable state.

21 Feedback Networks and Associative Memories
Continuous Hopfield NNs

22 The Neuron of Continuous Hopfield NNs
[Figure: the ith neuron modeled as an electrical circuit with capacitance Ci, leak conductance gi, and external current Ii; the outputs v1, ..., vn of the other neurons arrive through weights wi1, ..., win. The internal state is ui and the output is vi = a(ui), where a is a monotonically increasing sigmoid with inverse ui = a^{-1}(vi).]

23 The Dynamics [Figure: the same neuron circuit.] Summing the currents at the input node gives $C_i \frac{du_i}{dt} = \sum_{j=1}^{n} w_{ij} v_j - G_i u_i + I_i$, where $G_i = g_i + \sum_{j=1}^{n} w_{ij}$ is the total conductance seen by the capacitor.
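
A sketch of these dynamics under forward-Euler integration, assuming tanh for the activation a and illustrative values for all constants (none of the numbers come from the slides; the total conductances G are simply set to 1):

```python
import numpy as np

# Illustrative constants only.
n = 4
rng = np.random.default_rng(2)
A = rng.standard_normal((n, n))
W = (A + A.T) / 2                  # symmetric weight matrix
np.fill_diagonal(W, 0.0)           # no self-connections
C = np.ones(n)                     # capacitances C_i
G = np.ones(n)                     # total conductances G_i (assumed positive)
I = 0.1 * rng.standard_normal(n)   # external currents I_i

u = 0.01 * rng.standard_normal(n)  # internal states u_i
dt = 0.01
for _ in range(5000):
    v = np.tanh(u)                        # output v_i = a(u_i), monotone sigmoid
    u += dt * (W @ v - G * u + I) / C     # C_i du_i/dt = sum_j w_ij v_j - G_i u_i + I_i
print("settled outputs:", np.tanh(u))
```

Recording the Lyapunov energy of the following slides along this trajectory would show it decreasing monotonically until the state settles.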

24 The Continuous Hopfield NNs
[Figure: the full network of n such neuron circuits, mutually connected through the symmetric weights wij.]

25 The Continuous Hopfield NNs
[Figure: the same network.] Is this system stable?

26 Equilibrium Points
Consider the autonomous system $\dot{\mathbf{y}} = \boldsymbol{\phi}(\mathbf{y})$. Equilibrium points $\mathbf{y}^*$ satisfy $\boldsymbol{\phi}(\mathbf{y}^*) = \mathbf{0}$.

27 Lyapunov Theorem E(y) is called an energy (Lyapunov) function. The system is asymptotically stable at an equilibrium point if there exists a positive-definite function E(y) such that $\frac{dE}{dt} < 0$ along trajectories away from the equilibrium.
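
A one-line illustration (my example, not from the slides): for the scalar system $\dot{y} = -y$, the positive-definite function $E(y) = y^2$ gives $\frac{dE}{dt} = 2y\dot{y} = -2y^2 < 0$ for $y \ne 0$, so the equilibrium $y^* = 0$ is asymptotically stable.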

28 Lyapunov Energy Function
$E = -\frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n} w_{ij} v_i v_j + \sum_{i=1}^{n} G_i \int_0^{v_i} a^{-1}(v)\,dv - \sum_{i=1}^{n} I_i v_i$
[Figure: the network annotated with the quantities that appear in E.]

29 Lyapunov Energy Function
[Figure: the sigmoid v = a(u) and its inverse u = a^{-1}(v); the integral term $\int_0^{v_i} a^{-1}(v)\,dv$ is nonnegative and grows as vi approaches ±1.]

30 Stability of Continuous Hopfield NNs
Differentiating E along the dynamics: $\frac{dE}{dt} = \sum_i \frac{\partial E}{\partial v_i}\frac{dv_i}{dt} = -\sum_i \Big(\sum_j w_{ij} v_j - G_i u_i + I_i\Big)\frac{dv_i}{dt} = -\sum_i C_i \frac{du_i}{dt}\frac{dv_i}{dt}$

31 Stability of Continuous Hopfield NNs
Since $v_i = a(u_i)$ with $a'(u_i) > 0$ (a is monotonically increasing), $\frac{dv_i}{dt} = a'(u_i)\frac{du_i}{dt}$ and therefore $\frac{dE}{dt} = -\sum_i C_i\, a'(u_i)\Big(\frac{du_i}{dt}\Big)^2 \le 0$.

32 Stability of Continuous Hopfield NNs
[Figure: the full network.] E is lower bounded and non-increasing along trajectories, so the continuous Hopfield network is stable.

33 Feedback Networks and Associative Memories
Associative Memories

34 Associative Memories Also called content-addressable memory. Two kinds are covered here: autoassociative memory (Hopfield memory) and heteroassociative memory (Bidirectional Associative Memory, BAM).

35 Associative Memories Stored patterns are pairs $(\mathbf{x}^k, \mathbf{y}^k)$, $k = 1, \dots, p$. Autoassociative: $\mathbf{y}^k = \mathbf{x}^k$. Heteroassociative: $\mathbf{y}^k \ne \mathbf{x}^k$.

36 Feedback Networks and Associative Memories
Hopfield Memory, Bidirectional Memory

37 Hopfield Memory A fully connected network of 12 × 10 = 120 neurons, giving 120 × 120 = 14,400 weights.

38 Example
[Figure: several bitmap patterns are stored in the memory; presenting a corrupted pattern retrieves the associated stored pattern.]

40 Example [Figure: the same stored patterns and associations.] How are the patterns stored?

41 The Storage Algorithm
Suppose each stored pattern $\mathbf{x}^k \in \{-1, +1\}^n$, $k = 1, \dots, p$, is of dimension n. The question is whether each stored pattern is a fixed point of the update rule, i.e., whether $\mathrm{sgn}(W\mathbf{x}^k) = \mathbf{x}^k$.

42 The Storage Algorithm $W = \sum_{k=1}^{p} \mathbf{x}^k (\mathbf{x}^k)^T - p\,\mathbf{I}$, i.e., $w_{ij} = \sum_{k=1}^{p} x_i^k x_j^k$ for $i \ne j$ and $w_{ii} = 0$.
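
A sketch of this outer-product (Hebbian) storage rule together with asynchronous recall, assuming bipolar patterns (the patterns and the probe below are made up for illustration):

```python
import numpy as np

def store(patterns):
    """Hebbian storage: W = sum_k x^k (x^k)^T with the diagonal zeroed."""
    X = np.asarray(patterns, dtype=float)  # shape (p, n), entries +/-1
    W = X.T @ X                            # sum of outer products
    np.fill_diagonal(W, 0.0)               # w_ii = 0
    return W

def recall(W, x, sweeps=10):
    """Asynchronous recall starting from probe x."""
    v = np.asarray(x, dtype=float).copy()
    rng = np.random.default_rng(0)
    for _ in range(sweeps):
        for k in rng.permutation(len(v)):
            v[k] = 1.0 if W[k] @ v >= 0 else -1.0
    return v

patterns = [[1, -1, 1, -1, 1, -1], [1, 1, 1, -1, -1, -1]]
W = store(patterns)
noisy = np.array([1, -1, 1, -1, 1, 1])   # pattern 0 with one flipped bit
print(recall(W, noisy))                  # expected to recover pattern 0
```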

43 Analysis Suppose that $\mathbf{x} = \mathbf{x}^i$ is one of the stored patterns. Then $W\mathbf{x}^i = (n - p)\,\mathbf{x}^i + \sum_{k \ne i} \mathbf{x}^k (\mathbf{x}^k)^T \mathbf{x}^i$. If the stored patterns are mutually orthogonal, the crosstalk term vanishes and $\mathrm{sgn}(W\mathbf{x}^i) = \mathbf{x}^i$, so every stored pattern is a stable state.

44 Example

45 Example 1 2 3 4 2

46 Example 1 2 3 4 2 1 1 E=4 1 1 Stable E=0 1 1 E=4

47 Example 1 2 3 4 2 1 1 E=4 1 1 Stable E=0 1 1 E=4

48 Problems of Hopfield Memory
Complement memories: the complement of every stored pattern is also a stable state. Spurious stable states: stable states that correspond to no stored pattern. Limited capacity.

49 Capacity of Hopfield Memory
The number of storable patterns with respect to the size of the network. Methods of study: experiments, probability, information theory, and the radius of attraction (ρ).

50 Capacity of Hopfield Memory
The number of storable patterns with respect to the size of the network. Hopfield (1982) demonstrated that the maximum number of patterns that can be stored in an n-node network, before the error in the retrieved pattern becomes severe, is around 0.15n. The capacity of the Hopfield model can be increased, as shown in later work by Andrecut.
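
A toy version of the kind of experiment behind the 0.15n figure (network size and pattern counts below are assumptions for illustration): store p random bipolar patterns in an n-neuron network and measure how many remain exact fixed points:

```python
import numpy as np

def fraction_stable(n, p, trials=20, seed=0):
    """Fraction of stored random patterns that are exact fixed points."""
    rng = np.random.default_rng(seed)
    stable = 0.0
    for _ in range(trials):
        X = rng.choice([-1.0, 1.0], size=(p, n))
        W = X.T @ X
        np.fill_diagonal(W, 0.0)
        H = X @ W                  # row k is W x^k (W is symmetric)
        # A pattern is a fixed point if sign(W x) == x in every component.
        stable += np.mean(np.all(np.sign(H) == X, axis=1))
    return stable / trials

n = 100
for p in (5, 10, 15, 20, 30):
    print(f"p/n = {p/n:.2f}: {fraction_stable(n, p):.2f} of patterns stable")
```

Running this shows near-perfect storage well below 0.15n and increasingly frequent retrieval errors beyond it.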

51 Radius of Attraction (0 ≤ ρ ≤ 1/2)
[Figure: each stored pattern attracts every state within relative Hamming distance ρ of it; outside these basins the network can fall into false memories.]

52 Feedback Networks and Associative Memories
Hopfield Memory, Bidirectional Memory

53 Bidirectional Memory
[Figure: two fully interconnected layers. The X layer holds x1, ..., xm and the Y layer holds y1, ..., yn; they are connected through the weight matrix W = [wij] of size n × m. The forward pass sends the X-layer state through W to the Y layer; the backward pass sends the Y-layer state back through W^T to the X layer.]

54 Bidirectional Memory
[Figure: the same two-layer network.] Given the stored pattern pairs, how should W be chosen?

55 The Storage Algorithm For stored pattern pairs $(\mathbf{x}^k, \mathbf{y}^k)$, $k = 1, \dots, p$, with $\mathbf{x}^k \in \{-1,+1\}^m$ and $\mathbf{y}^k \in \{-1,+1\}^n$, set $W = \sum_{k=1}^{p} \mathbf{y}^k (\mathbf{x}^k)^T$.
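
A sketch of this storage rule with bidirectional recall, assuming bipolar pairs and the standard passes y = sgn(Wx) forward and x = sgn(W^T y) backward (the pattern pairs below are made up for illustration):

```python
import numpy as np

def sgn(v):
    """Bipolar threshold; ties (exact zeros) break toward +1."""
    return np.where(v >= 0.0, 1.0, -1.0)

def bam_store(X, Y):
    """Correlation storage: W = sum_k y^k (x^k)^T, an n-by-m matrix."""
    return Y.T @ X

def bam_recall(W, x):
    """Bounce between the layers until the pair (x, y) stops changing."""
    x = sgn(np.asarray(x, dtype=float))
    y = sgn(W @ x)                    # forward pass:  y = sgn(W x)
    while True:
        x_new = sgn(W.T @ y)          # backward pass: x = sgn(W^T y)
        y_new = sgn(W @ x_new)        # forward pass again
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            return x, y
        x, y = x_new, y_new

X = np.array([[1., -1., 1., -1.], [1., 1., -1., -1.]])  # stored x^k (m = 4)
Y = np.array([[1., -1., 1.], [-1., 1., 1.]])            # stored y^k (n = 3)
W = bam_store(X, Y)
x, y = bam_recall(W, [1., -1., 1., 1.])                 # noisy version of x^1
print(x, y)                                             # recovers (x^1, y^1)
```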

56 Analysis Suppose $\mathbf{x}^{k'}$ is one of the stored vectors. Then $W\mathbf{x}^{k'} = \sum_{k} \mathbf{y}^k (\mathbf{x}^k)^T \mathbf{x}^{k'} = m\,\mathbf{y}^{k'} + \sum_{k \ne k'} \mathbf{y}^k (\mathbf{x}^k)^T \mathbf{x}^{k'}$. If the $\mathbf{x}^k$ are mutually orthogonal, the crosstalk term is 0 and $\mathrm{sgn}(W\mathbf{x}^{k'}) = \mathbf{y}^{k'}$.

57 Energy Function $E(\mathbf{x}, \mathbf{y}) = -\mathbf{y}^T W \mathbf{x} = -\sum_{i=1}^{n}\sum_{j=1}^{m} w_{ij} y_i x_j$

58 Bidirectional Memory Energy function: $E(\mathbf{x}, \mathbf{y}) = -\mathbf{y}^T W \mathbf{x}$ is lower bounded and non-increasing under the forward and backward passes, so bidirectional recall converges to a stable pair of states.


Download ppt "Feedback Networks and Associative Memories"

Similar presentations


Ads by Google