-Artificial Neural Network- Hopfield Neural Network (HNN)
Chaoyang University of Technology, Dept. of Information Management, Prof. 李麗華

Associative Memory (AM) - 1
Def: Associative memory (AM) is any device that associates a set of predefined output patterns with specific input patterns.
Two types of AM:
– Auto-associative memory: converts a corrupted input pattern into the stored pattern it most resembles.
– Hetero-associative memory: produces the output pattern that was stored with the most similar input pattern.

Associative Memory (AM) - 2
Models: AM performs an associative mapping of an input vector X = (x1, x2, …, xn) into an output vector V = (v1, v2, …, vm).
EX: Hopfield Neural Network (HNN)
EX: Bidirectional Associative Memory (BAM)

Introduction
Hopfield Neural Network (HNN) was proposed by John Hopfield in 1982. HNN is an auto-associative memory network. It is a one-layer, fully connected network over nodes X1, X2, …, Xn.

HNN Architecture
Input: Xi ∈ {-1, +1}
Output: same as the input nodes (it is a single-layer network)
Transfer function:
  Xi_new = +1 if net_i > 0; Xi (the previous value of Xi) if net_i = 0; -1 if net_i < 0
Weights: Wij = Σp Xi(p)·Xj(p) for i ≠ j, with Wii = 0
Connections: each node is connected to every other node

HNN Learning Process
Learning process:
a. Set up the network, i.e., design the input nodes and connections.
b. Calculate and derive the weight matrix.
c. Store the weight matrix.
The learning process is done once the weight matrix is derived; the result is an n×n weight matrix W.
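The learning steps above (the Hebbian outer-product rule with Wii = 0) can be sketched in Python with NumPy; the function name and library choice are illustrative, not part of the original slides:

```python
import numpy as np

def hopfield_weights(patterns):
    """Derive the n x n HNN weight matrix from bipolar training patterns.

    patterns: sequence of 1-D arrays with entries in {-1, +1}.
    Implements the Hebbian outer-product rule W = sum_p x_p x_p^T,
    with the diagonal zeroed (W_ii = 0, no self-connections).
    """
    X = np.asarray(patterns)
    W = X.T @ X               # sum of outer products over all patterns
    np.fill_diagonal(W, 0)    # W_ii = 0
    return W
```

Storing the returned matrix completes the learning phase; recall needs only W.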

HNN Recall Process
Recall:
a. Read the n×n weight matrix W.
b. Input the test pattern X for recalling.
c. Compute the new input (i.e., output): net_j = Σi Wij·Xi (i.e., net = W·X), then
   Xj_new = +1 if net_j > 0; Xj_old if net_j = 0; -1 if net_j < 0
d. Repeat step c until the network converges (i.e., the state no longer changes or the error is very small).
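A minimal sketch of the recall loop, using the sign-with-hold update from step c (synchronous updates for simplicity; the function name is illustrative):

```python
import numpy as np

def hopfield_recall(W, x, max_iters=100):
    """Recall a stored pattern from a (possibly corrupted) bipolar input.

    Update rule: X_j <- +1 if net_j > 0, -1 if net_j < 0,
    and keep the old X_j when net_j = 0. Iterates until the
    state stops changing (convergence).
    """
    x = np.asarray(x, dtype=int)
    for _ in range(max_iters):
        net = W @ x                                      # net_j = sum_i W_ij x_i
        x_new = np.where(net > 0, 1, np.where(net < 0, -1, x))
        if np.array_equal(x_new, x):                     # converged: unchanged
            break
        x = x_new
    return x
```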

Example: Use HNN to Memorize Patterns (1)
Use HNN to memorize the following patterns. Let a green cell be represented by "1" and a white cell by "-1". The input data are given as four patterns X1–X4, each with six components X1–X6. [The pattern figures and table values are not recoverable from this transcript.]

Example: Use HNN to Memorize Patterns (2)
Derive the weight matrix from the patterns X1–X4, setting Wii = 0. [The weight-matrix values are not recoverable from this transcript.]

Example: Use HNN to Memorize Patterns (3)
Recall: the test pattern is recalled as the stored pattern. [The recall figures are not recoverable from this transcript.]

-Artificial Neural Network- Bidirectional Associative Memory (BAM)
Chaoyang University of Technology, Dept. of Information Management, Prof. 李麗華

Introduction
Bidirectional Associative Memory (BAM) was proposed by Bart Kosko in 1988. It is a hetero-associative memory network: it allows the network to memorize a set of patterns Xp and recall the corresponding set of patterns Yp through output nodes Y1, Y2, …, Ym.


BAM Architecture
It is a 2-layer, fully connected network with both feed-forward and feed-back connections.
1. Input layer: nodes X1, …, Xn
2. Output layer: nodes Y1, …, Ym
3. Weights: Wij connects Xi and Yj
4. Connections: every input node is connected to every output node

BAM Architecture (cont.)
5. Transfer function: a node takes the value +1 if its net input is greater than 0, -1 if it is less than 0, and keeps its previous value if the net input is 0 (the same sign-with-hold rule as in HNN).

BAM Example (1/4)
Four training pairs associate ●/○ pattern grids with outputs Y1–Y4, plus a test pattern. [The test-pattern figures are not recoverable from this transcript.]

BAM Example (2/4)
1. Learning
– Set up the network.
– Set up the weights.
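The weight setup can be sketched with the standard BAM outer-product rule W = Σp xp·ypᵀ; since the slides' numeric values are not recoverable, the function name and any example data are my own:

```python
import numpy as np

def bam_weights(x_patterns, y_patterns):
    """n x m BAM weight matrix from bipolar pattern pairs.

    W = sum_p x_p y_p^T; the forward pass (X -> Y) then uses W^T
    and the backward pass (Y -> X) uses W.
    """
    X = np.asarray(x_patterns)
    Y = np.asarray(y_patterns)
    return X.T @ Y            # sum of outer products over all pairs
```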

BAM Example (3/4)
2. Recall
a. Read the network weights.
b. Read the test pattern.
c. Compute Y.
d. Compute X.
e. Repeat steps c and d until the network converges.
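Steps a–e above can be sketched as a loop that bounces between the two layers until both stabilize (function names are illustrative; the zero-net case keeps the previous value, as in HNN):

```python
import numpy as np

def bam_recall(W, x, max_iters=100):
    """Bidirectional recall for a BAM with weight matrix W (n x m).

    Alternates computing Y from X and X from Y with a
    sign-with-hold threshold until neither layer changes.
    """
    def threshold(net, old):
        return np.where(net > 0, 1, np.where(net < 0, -1, old))

    x = np.asarray(x, dtype=int)
    y = threshold(W.T @ x, np.ones(W.shape[1], dtype=int))  # first forward pass
    for _ in range(max_iters):
        x_new = threshold(W @ y, x)        # backward pass: Y -> X
        y_new = threshold(W.T @ x_new, y)  # forward pass: X -> Y
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break                          # both layers unchanged: converged
        x, y = x_new, y_new
    return x, y
```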

BAM Example (4/4)
Clustering application: the 1×6 test pattern ( ) is presented and recalled; the result is identical on two consecutive passes, so the network has converged to the stored ●●● ○●○ pattern.