-Artificial Neural Network- Hopfield Neural Network (HNN)
Chaoyang University of Technology, Department of Information Management, Prof. Li-Hua Lee
Associative Memory (AM) - 1
Def: Associative memory (AM) is any device that associates a set of predefined output patterns with specific input patterns.
Two types of AM:
– Auto-associative memory: converts a corrupted input pattern into the stored pattern it most closely resembles.
– Hetero-associative memory: produces the output pattern that was stored in correspondence with the most similar input pattern.
Associative Memory (AM) - 2
Models: An AM performs an associative mapping of an input vector X = (x1, x2, x3, …, xn) into an output vector V = (v1, v2, v3, …, vm).
EX: Hopfield Neural Network (HNN)
EX: Bidirectional Associative Memory (BAM)
Introduction
The Hopfield Neural Network (HNN) was proposed by Hopfield in 1982. HNN is an auto-associative memory network. It is a one-layer, fully connected network over units X1, X2, …, Xn.
HNN Architecture
Input: Xi ∈ {-1, +1}
Output: same as the input (since it is a single-layer network)
Transfer function:
Xi_new = +1 if net_i > 0; Xi if net_i = 0; -1 if net_i < 0
(Here Xi refers to the previous value of X.)
Weights: Wij, an n×n matrix with Wii = 0
Connections: every unit is connected to every other unit
HNN Learning Process
Learning process:
a. Set up the network, i.e., design the input nodes and connections.
b. Calculate and derive the weight matrix.
c. Store the weight matrix.
The learning process is done once the weight matrix is derived. We obtain an n×n weight matrix, W (n×n).
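The learning steps above can be sketched in a few lines of plain Python. This is a minimal illustration, not taken from the original slides; it assumes the standard Hebbian outer-product rule with a zeroed diagonal, consistent with the Wii = 0 condition used in the example that follows.

```python
# Minimal sketch of the HNN learning step (pure Python, no external
# libraries). Patterns are lists of +1/-1 values; the weight matrix is
# the Hebbian outer-product sum with a zeroed diagonal (W_ii = 0).

def hnn_learn(patterns):
    """Derive the n x n Hopfield weight matrix from bipolar patterns."""
    n = len(patterns[0])
    W = [[0] * n for _ in range(n)]
    for x in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:              # keep the diagonal at zero
                    W[i][j] += x[i] * x[j]
    return W

# Example: store two 4-unit bipolar patterns.
W = hnn_learn([[1, -1, 1, -1], [1, 1, -1, -1]])
```

Note that the resulting matrix is symmetric (W[i][j] = W[j][i]), which is what guarantees the recall iteration in the next slide converges to a stable state.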
HNN Recall Process
Recall:
a. Read the n×n weight matrix W.
b. Input the test pattern X for recall.
c. Compute the new input (i.e., the output): net = W·X, then
X j_new = +1 if net_j > 0; Xj_old if net_j = 0; -1 if net_j < 0
d. Repeat step c until the network converges (i.e., the states no longer change or the error is very small).
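The recall loop can be sketched similarly. Again, this is a hypothetical plain-Python illustration: synchronous updates are assumed, and a unit keeps its previous value when its net input is exactly zero, matching the transfer function above.

```python
# Sketch of the HNN recall loop: iterate x_new = sign(W . x) until the
# state stops changing; a unit with net input 0 keeps its old value.

def hnn_recall(W, x, max_iters=100):
    x = list(x)
    for _ in range(max_iters):
        # net_j = sum_i W[j][i] * x[i]
        net = [sum(W[j][i] * x[i] for i in range(len(x)))
               for j in range(len(x))]
        x_new = [1 if n > 0 else (-1 if n < 0 else old)
                 for n, old in zip(net, x)]
        if x_new == x:          # converged: no unit changed state
            return x_new
        x = x_new
    return x

# Weights for a single stored pattern (1, -1, 1, -1),
# i.e. W[i][j] = x[i] * x[j] with a zero diagonal:
W = [[0, -1, 1, -1],
     [-1, 0, -1, 1],
     [1, -1, 0, -1],
     [-1, 1, -1, 0]]
recalled = hnn_recall(W, [1, -1, 1, 1])   # last component corrupted
```

Feeding in a corrupted version of the stored pattern, the loop restores the original pattern in one update and then detects that the state is stable.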
Example: Use HNN to memorize patterns (1)
Use HNN to memorize the following patterns. Let a green cell be represented by "1" and a white cell by "-1". The input data are four training patterns, X1-X4, each with six components X1-X6.
[Pattern grids and input table from the original slide.]
Example: Use HNN to memorize patterns (2)
The weight matrix is derived from the four patterns, with Wii = 0.
[Weight computation and input table from the original slide.]
Example: Use HNN to memorize patterns (3)
Recall: a test pattern is fed into the network and the stored pattern is recalled.
[Recall computation from the original slide.]
-Artificial Neural Network- Bidirectional Associative Memory (BAM)
Introduction
The Bidirectional Associative Memory (BAM) was proposed by Bart Kosko in 1985. It is a hetero-associative memory network: it allows the network to memorize a set of patterns Xp in order to recall another set of patterns Yp.
BAM Architecture
It is a two-layer, fully connected network with both feed-forward and feed-back connections.
Input layer: X1, …, Xn
Output layer: Y1, …, Ym
Weights: an m×n weight matrix W
Connections: every input unit is connected to every output unit
BAM Architecture (cont.)
Transfer function: the same bipolar threshold as in HNN, i.e.,
Yj_new = +1 if net_j > 0; Yj_old if net_j = 0; -1 if net_j < 0
BAM Example (1/4)
Four test pattern pairs (X1, Y1) … (X4, Y4), drawn as filled/empty cell grids, are to be stored.
[Pattern grids from the original slide.]
BAM Example (2/4)
1. Learning
– Set up the network.
– Set up the weights.
[Weight computation and pattern table from the original slide.]
BAM Example (3/4)
2. Recall
(1) Read the network weights.
(2) Read the test pattern.
(3) Compute Y.
(4) Compute X.
Repeat (3) and (4) until the network converges.
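The recall steps above can be sketched in plain Python. This is a hypothetical illustration: it assumes the standard BAM weight rule W = Σp yp·xpᵀ, and a unit whose net input is exactly zero keeps its previous value, as in the transfer function.

```python
# Sketch of BAM learning and recall: W = sum_p y_p x_p^T, then recall
# bounces between layers (Y = sign(W.x), X = sign(W^T.y)) until both
# pattern vectors are stable.

def bam_learn(pairs):
    """Derive the m x n BAM weight matrix from (x, y) bipolar pairs."""
    m, n = len(pairs[0][1]), len(pairs[0][0])
    W = [[0] * n for _ in range(m)]
    for x, y in pairs:
        for j in range(m):
            for i in range(n):
                W[j][i] += y[j] * x[i]
    return W

def _threshold(net, old):
    # +1 / -1 threshold; keep the old value when the net input is zero.
    return [1 if v > 0 else (-1 if v < 0 else o) for v, o in zip(net, old)]

def bam_recall(W, x, max_iters=100):
    """Alternate Y = sign(W.x) and X = sign(W^T.y) until convergence."""
    m, n = len(W), len(W[0])
    y = [1] * m                       # arbitrary initial Y state
    for _ in range(max_iters):
        y_new = _threshold([sum(W[j][i] * x[i] for i in range(n))
                            for j in range(m)], y)
        x_new = _threshold([sum(W[j][i] * y_new[j] for j in range(m))
                            for i in range(n)], x)
        if x_new == x and y_new == y:
            return x_new, y_new       # both layers stable
        x, y = x_new, y_new
    return x, y
```

Storing a single 6-to-2 pair and presenting a corrupted X recovers both the clean X and its associated Y in one bounce between the layers.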
BAM Example (4/4)
Application to clustering: two successive passes give the same result, so the network has converged on the test pattern (1 1 1 -1 1 -1), a 1×6 vector.
[Recall computation and pattern grid from the original slide.]