1 Pattern Recognition: Statistical and Neural Lonnie C. Ludeman Lecture 21 Oct 28, 2005 Nanjing University of Science & Technology

2 Lecture 21 Topics
1. Example – Analysis of a simple Neural Network
2. Example – Synthesis of special forms of Artificial Neural Networks
3. General concepts of training an Artificial Neural Network: supervised and unsupervised, training sets
4. Neural Network Nomenclature and Notation
5. Derivation and Description of the Backpropagation Algorithm for Feedforward Neural Networks

3 Example: Analyze the following Neural Network

4 Solution: Outputs of layer 1 ANEs

5 Output of layer 2 ANE [equations shown on slide]: from layer 1, the outputs are combined with a −2 term and the result is thresholded at zero (cases ≥ 0 and < 0)

6 [figure only; no transcribed text]

7 Final Solution: Output Function for Given Neural Network

8 Example: Synthesize a Neural Network. Given the following decision regions, build a neural network to perform the classification. Solution: use the Hyperplane-AND-OR structure.

9 Each g_k(x) specifies a hyperplane boundary

10 Solution: Hyperplane Layer → AND Layer → OR Layer, with all nonlinearities f(·) = μ(·)
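
As a concrete illustration of this structure (a minimal sketch; the hyperplane weights, biases, and region below are invented for the example, not taken from the slide), each first-layer node thresholds one hyperplane g_k(x), an AND node fires only when all of its inputs fire, and an OR node fires when any of its inputs fires:

```python
import numpy as np

def step(v):
    # Unit step mu(.): 1 where v >= 0, else 0
    return (v >= 0).astype(float)

# Hypothetical hyperplanes g_k(x) = w_k . x + b_k (illustrative values only)
W = np.array([[ 1.0,  0.0],    # g_1: x1 - 1 >= 0
              [ 0.0,  1.0],    # g_2: x2 - 1 >= 0
              [-1.0, -1.0]])   # g_3: 4 - x1 - x2 >= 0
b = np.array([-1.0, -1.0, 4.0])

def hyperplane_layer(x):
    return step(W @ x + b)               # one 0/1 indicator per hyperplane

def and_node(h):
    # Fires only when all inputs fire: sum(h) - (n - 0.5) >= 0
    return step(np.sum(h) - (len(h) - 0.5))

def or_node(a):
    # Fires when any input fires: sum(a) - 0.5 >= 0
    return step(np.sum(a) - 0.5)

x = np.array([2.0, 1.5])
inside = or_node(np.array([and_node(hyperplane_layer(x))]))
print(inside)  # 1.0 when x lies in the region cut out by the three hyperplanes
```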

11 Training a Neural Network: “with a teacher” (supervised) or “without a teacher” (unsupervised)

12 [figure only; no transcribed text]

13 Training Set: the x_j are the training samples and d_j is the class assigned to training sample x_j

14 Example of a training set:
{ (x_1 = [0, 1, 2]^T, d_1 = C_1),
  (x_2 = [0, 1, 0]^T, d_2 = C_1),
  (x_3 = [0, 1, 1]^T, d_3 = C_1),
  (x_4 = [1, 0, 2]^T, d_4 = C_2),
  (x_5 = [1, 0, 3]^T, d_5 = C_2),
  (x_6 = [0, 0, 1]^T, d_6 = C_3),
  (x_7 = [0, 0, 2]^T, d_7 = C_3),
  (x_8 = [0, 0, 3]^T, d_8 = C_3),
  (x_9 = [0, 0, 3]^T, d_9 = C_3),
  (x_10 = [1, 1, 0]^T, d_10 = C_4),
  (x_11 = [2, 2, 0]^T, d_11 = C_4),
  (x_12 = [2, 2, 2]^T, d_12 = C_5),
  (x_13 = [3, 2, 2]^T, d_13 = C_6) }
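
As a data structure, this set is simply a list of (sample, label) pairs; a minimal sketch in code:

```python
import numpy as np

# The thirteen samples from this slide as (x_j, d_j) pairs
training_set = [
    (np.array([0, 1, 2]), "C1"), (np.array([0, 1, 0]), "C1"),
    (np.array([0, 1, 1]), "C1"), (np.array([1, 0, 2]), "C2"),
    (np.array([1, 0, 3]), "C2"), (np.array([0, 0, 1]), "C3"),
    (np.array([0, 0, 2]), "C3"), (np.array([0, 0, 3]), "C3"),
    (np.array([0, 0, 3]), "C3"), (np.array([1, 1, 0]), "C4"),
    (np.array([2, 2, 0]), "C4"), (np.array([2, 2, 2]), "C5"),
    (np.array([3, 2, 2]), "C6"),
]
```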

15 General Weight Update Algorithm [update equation shown on slide]: x(k) is the training sample for the k-th iteration, d(k) is the class assigned to training sample x(k), and y(k) is the output vector for the k-th training sample

16 Training with a Teacher (Supervised)
1. Given a set of N ordered samples with their known class assignments.
2. Randomly select all weights in the neural network.
3. For each successive sample in the total set of samples, evaluate the output.
4. Use these outputs and the input sample to update the weights.
5. Stop at some predetermined number of iterations or when a given performance measure is satisfied; otherwise, go to step 3. (A code sketch of this loop follows.)
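
A minimal sketch of the loop above, assuming generic forward and update_weights functions (both hypothetical placeholders, since the slide does not fix the update equations here):

```python
import numpy as np

def train_supervised(samples, labels, weights, forward, update_weights,
                     max_iters=1000, target_error=1e-3):
    """Steps 2-5 of the slide: cycle through the samples, evaluate the
    output for each one, and update the weights until stopping."""
    for _ in range(max_iters):                          # step 5: iteration cap
        total_error = 0.0
        for x, d in zip(samples, labels):               # step 3: each sample
            y = forward(weights, x)                     # evaluate the output
            weights = update_weights(weights, x, d, y)  # step 4
            total_error += 0.5 * np.sum((d - y) ** 2)
        if total_error < target_error:                  # step 5: performance test
            return weights
    return weights
```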

17 Training without a Teacher (Unsupervised)
1. Given a set of N ordered samples with unknown class assignments.
2. Randomly select all weights in the neural network.
3. For each successive sample in the total set of samples, evaluate the outputs.
4. Use these outputs and the inputs to update the weights.
5. If the weights do not change significantly, stop with that result; if they do change, return to step 3. (A sketch of this stopping test follows.)
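
The stopping test in step 5 can be written as a threshold on the largest weight change over one pass; a minimal sketch (the tolerance value is an assumption, not from the slide):

```python
import numpy as np

def weights_converged(old_weights, new_weights, tol=1e-6):
    # Step 5: stop when no weight moved by more than tol in this pass
    return all(np.max(np.abs(w_new - w_old)) < tol
               for w_old, w_new in zip(old_weights, new_weights))
```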

18 Supervised Training of a Feedforward Neural Network Nomenclature

19 [figure: notation diagram labeling the output vector of layer m, the output vector of the last layer L, and the node numbers within layers m and L]

20 [figure: the weight matrix for layer m, with one row per node 1, 2, …, N_m]

21 Layers, Nets, Outputs, Nonlinearities [definitions shown on slide]

22 Define the performance E_p for sample x(p) [equation shown on slide]. We wish to select the weights so that E_p is minimized; use the gradient algorithm.
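
The E_p equation itself is an image in the transcript; by analogy with the total-error formula on slide 35, it is presumably the squared output error for sample p, something like

E_p = ½ Σ_k ( d_k[x(p)] − y_k^(L)(p) )²

with the sum running over the output nodes k (a hedged reconstruction, not the slide's exact expression).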

23 Gradient Algorithm for Updating the Weights [equation shown on slide]: at iteration p, the current weight vector w(p) is adjusted using sample x(p) in the direction that reduces E_p

24 Derivation of the weight update equation for the Last Layer (Rule #1), Backpropagation Algorithm. The partial of y_m^(L) with respect to w_kj^(L) is [derivation shown on slide]

25 General Rule #1 for Weight Update: therefore [update equation shown on slide]
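
Since the equation is an image, here is the standard output-layer form of Rule #1 as a reconstruction (η is the learning rate, f′ the derivative of the node nonlinearity, net_k^(L) the node's weighted input sum; the lecture's exact notation may differ):

Δw_kj^(L) = η δ_k^(L) y_j^(L−1), where δ_k^(L) = ( d_k − y_k^(L) ) f′( net_k^(L) )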

26 Derivation of the weight update equation for the Next-to-Last Layer (L−1), Backpropagation Algorithm

27 [figure only; no transcribed text]

28 General Rule #2 for Weight Update, Layer L−1 (Backpropagation Algorithm): therefore [equation shown on slide], and the weight correction is as follows

29 where the weight correction (general Rule #2) is Δw^(L−1) [equation shown on slide]
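
As with Rule #1, the standard reconstruction: the hidden-layer delta is formed by propagating the layer-L deltas backward through the layer-L weights,

Δw_ji^(L−1) = η δ_j^(L−1) y_i^(L−2), where δ_j^(L−1) = f′( net_j^(L−1) ) Σ_k δ_k^(L) w_kj^(L)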

30 Backpropagation Training Algorithm for Feedforward Neural Networks

31 Input pattern sample x_k

32 Calculate Outputs: First Layer

33 Calculate Outputs: Second Layer

34 Calculate Outputs: Last Layer
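
Slides 32–34 are the forward pass; a minimal sketch for a fully connected feedforward net (the sigmoid nonlinearity here is an assumption; the lecture's f(·) may differ):

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def forward_pass(weights, biases, x):
    """Propagate input x through every layer, first to last,
    returning the output vector of each layer."""
    outputs = []
    y = x
    for W, b in zip(weights, biases):   # one (W, b) pair per layer
        y = sigmoid(W @ y + b)          # layer output y^(m)
        outputs.append(y)
    return outputs                      # outputs[-1] is the last-layer output y^(L)
```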

35 Check Performance
Error over all samples (the single-sample errors summed over a window of N_s samples):
E_TOTAL(p) = ½ Σ_{i=0}^{N_s−1} ( d[x(p−i)] − f( w^T(p−i) x(p−i) ) )²
Can be computed recursively:
E_TOTAL(p+1) = E_TOTAL(p) + E_{p+1}(p+1) − E_{p−N_s}(p−N_s)
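
The recursion keeps a sliding window of the last N_s single-sample errors, adding the newest and dropping the oldest; a minimal sketch:

```python
from collections import deque

class RunningError:
    """Maintain E_TOTAL over the most recent n_s single-sample errors."""
    def __init__(self, n_s):
        self.window = deque(maxlen=n_s)
        self.e_total = 0.0

    def update(self, e_new):
        # E_TOTAL(p+1) = E_TOTAL(p) + E_{p+1} - E_{p-N_s}
        if len(self.window) == self.window.maxlen:
            self.e_total -= self.window[0]  # oldest error drops out of the window
        self.window.append(e_new)
        self.e_total += e_new
        return self.e_total
```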

36 Change Weights, Last Layer, using Rule #1

37 Change Weights, Previous Layer, using Rule #2

38 Change Weights, Previous Layer, using Modified Rule #2

39 Input pattern sample x_{k+1}; continue iterations until …

40 Repeat the process until the performance measure is satisfied or the maximum number of iterations is reached. If performance is not satisfied at the maximum number of iterations, the algorithm stops and NO design is obtained. If performance is satisfied, the current weights and structure provide the required design.

41 Freeze Weights to get Acceptable Neural Net Design

42 Backpropagation Algorithm for Training Feedforward Artificial Neural Networks
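
Pulling slides 31–41 together, a compact end-to-end sketch of the algorithm (the sigmoid nonlinearity, squared-error measure, and learning-rate value are assumptions illustrating the standard method, not the lecture's exact notation):

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def backprop_train(samples, targets, layer_sizes, eta=0.5,
                   max_iters=5000, tol=1e-3):
    rng = np.random.default_rng(0)
    # Slide 30 / step 2: randomly select all weights (and biases)
    Ws = [rng.normal(0.0, 0.5, (n_out, n_in))
          for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]
    bs = [np.zeros(n_out) for n_out in layer_sizes[1:]]

    for _ in range(max_iters):
        e_total = 0.0
        for x, d in zip(samples, targets):
            # Forward pass (slides 31-34): keep every layer output
            ys = [x]
            for W, b in zip(Ws, bs):
                ys.append(sigmoid(W @ ys[-1] + b))
            e_total += 0.5 * np.sum((d - ys[-1]) ** 2)

            # Rule #1 (slide 36): delta at the last layer,
            # using f'(net) = y(1 - y) for the sigmoid
            delta = (d - ys[-1]) * ys[-1] * (1.0 - ys[-1])
            # Rule #2 (slides 37-38): walk backward through the layers
            for m in range(len(Ws) - 1, -1, -1):
                grad_W = np.outer(delta, ys[m])
                if m > 0:  # delta for the previous layer, from pre-update weights
                    delta_prev = (Ws[m].T @ delta) * ys[m] * (1.0 - ys[m])
                Ws[m] += eta * grad_W
                bs[m] += eta * delta
                if m > 0:
                    delta = delta_prev

        if e_total < tol:  # slide 40: stop when performance is satisfied
            break
    return Ws, bs          # slide 41: freeze weights
```

For instance, with the slide-14 samples one could encode the six classes as target vectors and call backprop_train(X, D, layer_sizes=[3, 4, 6]); the hidden-layer size here is illustrative.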

43 Summary, Lecture 21
1. Example – Analysis of a simple Neural Network
2. Example – Synthesis of special forms of Artificial Neural Networks
3. General concepts of training an Artificial Neural Network: supervised and unsupervised, and description of training sets
4. Neural Network Nomenclature and Notation
5. Derivation and Description of the Backpropagation Algorithm for Feedforward Neural Networks

44 End of Lecture 21