1
Pattern Recognition: Statistical and Neural
Nanjing University of Science & Technology
Pattern Recognition: Statistical and Neural
Lonnie C. Ludeman
Lecture 22, Oct 28, 2005
2
Lecture 22 Topics
1. Review Backpropagation Algorithm
2. Weight Update Rules 1 and 2 for Logistic and Tanh Activation Functions
3. Output Structure for Neural Net Classifiers: Single, Multiple, and Coded Output Nodes
4. Words of Wisdom
5. Overall Design and Testing Methodology
3
Backpropagation Algorithm for Training a Feedforward Neural Network
4
Input pattern sample x_k
5
Calculate Outputs First Layer
6
Calculate Outputs Second Layer
7
Calculate Outputs Last Layer
8
Check Performance
Single sample error: E_p = ½ ( d[x(p)] − f( w^T(p) x(p) ) )²
Over-all-samples error: E_TOTAL(p) = Σ_{i=0}^{Ns−1} ½ ( d[x(p−i)] − f( w^T(p−i) x(p−i) ) )²
E_TOTAL can be computed recursively: E_TOTAL(p+1) = E_TOTAL(p) + E_{p+1}(p+1) − E_{p−Ns}(p−Ns)
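A small sketch of maintaining this over-all-samples error recursively with a sliding window of the last Ns per-sample errors, as the recursion above suggests; the class name RunningError is illustrative:

```python
from collections import deque

class RunningError:
    """Running sum of the last Ns single-sample errors, updated recursively."""
    def __init__(self, Ns):
        self.window = deque(maxlen=Ns)   # per-sample errors for the last Ns samples
        self.total = 0.0

    def update(self, d, y):
        e = 0.5 * (d - y) ** 2           # single-sample squared error
        if len(self.window) == self.window.maxlen:
            self.total -= self.window[0] # subtract the error that falls out of the window
        self.window.append(e)
        self.total += e                  # add the newest sample's error
        return self.total
```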
9
Change Weights Last Layer using Rule #1
10
Change Weights previous Layer using Rule #2
11
Change Weights previous Layer using Modified Rule #2
12
Input pattern sample x_(k+1)
Continue Iterations Until
13
Repeat the process until the performance criterion is satisfied or the maximum number of iterations is reached.
If performance is not satisfied at the maximum number of iterations, the algorithm stops and NO design is obtained. If performance is satisfied, then the current weights and structure provide the required design.
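A minimal sketch of this stopping logic, assuming hypothetical helpers train_one_pass and total_error stand in for the forward/backward steps above:

```python
def train(net, samples, max_iters, error_goal):
    """Iterate until the error goal is met or the iteration limit is hit.
    Only in the first case is a design accepted."""
    for it in range(max_iters):
        train_one_pass(net, samples)              # one sweep of weight updates
        if total_error(net, samples) <= error_goal:
            return net                            # freeze weights: design accepted
    return None                                   # limit reached: NO design obtained
```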
14
Freeze Weights to get Acceptable Neural Net Design
15
General Rule #1 for Weight Update
Therefore
16
General Rule #2 for Weight Update- Layer L-1
Therefore, the weight correction is as follows
17
where the weight correction (general Rule #2) is
18
Specific Rules for Given Activation Functions
1. Rule #1 for Logistic Activation Function
2. Rule #2 for Logistic Activation Function
3. Rule #1 for Tanh Activation Function
4. Rule #2 for Tanh Activation Function
19
Rule #1 for Logistic Activation Function
Lth Layer Weight Update Equation
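The slide's update equation itself is an image, so here is a hedged sketch of the usual form of Rule #1 with a logistic activation: the logistic derivative expressed through the layer output is y(1 − y), and the last-layer weights move in proportion to that delta times the previous layer's output. The function and argument names are illustrative:

```python
import numpy as np

def rule1_logistic(W_last, y_out, y_prev, d, eta):
    """Update last-layer weights W_last given the layer output y_out, the
    previous layer's output y_prev, the desired output d, and step size eta."""
    delta = (d - y_out) * y_out * (1.0 - y_out)   # error times logistic derivative
    W_last += eta * np.outer(delta, y_prev)       # delta_w = eta * delta * input
    return W_last, delta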
20
Rule #2 for Logistic Activation Function
(L-1)th Layer Weight Correction Equation
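A corresponding sketch of Rule #2 with a logistic activation: the (L-1)th-layer delta is formed by propagating the next layer's deltas back through its weights and multiplying by the local logistic derivative y(1 − y). Again the names are illustrative, not the slide's notation:

```python
import numpy as np

def rule2_logistic(W_hidden, W_next, delta_next, y_hidden, y_prev, eta):
    """Update the (L-1)th-layer weights from the deltas of the layer above."""
    delta = (W_next.T @ delta_next) * y_hidden * (1.0 - y_hidden)  # back-propagated delta
    W_hidden += eta * np.outer(delta, y_prev)                      # same form as Rule #1
    return W_hidden, delta
```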
21
Rule #1 for Tanh Activation Function
Lth Layer Weight Update Equation
22
Rule #2 for Tanh Activation Function
(L-1)th Layer Weight Correction Equation
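For the tanh activation the only change from the logistic rules is the derivative term: d/dv tanh(v) = 1 − y² when expressed through the output y. A short sketch, with the same illustrative names as before:

```python
def tanh_deriv(y):
    """Derivative of tanh expressed in terms of the layer output y."""
    return 1.0 - y ** 2

# Rule #1 (last layer, tanh):      delta_out = (d - y_out) * tanh_deriv(y_out)
# Rule #2 ((L-1)th layer, tanh):   delta_hid = (W_next.T @ delta_next) * tanh_deriv(y_hid)
```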
23
Selection of Output Structure for Classifier Design
(a) Single output node
(b) N output nodes for N classes
(c) log2(N) coded output nodes
24
(a) Single Output Node. Example: four classes with one output node.
25
(a) Single Output Node, K-class case (one output neuron): target t_i selected as the center of region R_i.
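A sketch of the corresponding decision rule, assuming each class i is assigned a scalar target t_i at the center of its interval R_i and the decision picks the class whose target is closest to the single output y; the target values shown are hypothetical:

```python
import numpy as np

def decide_single_output(y, targets):
    """targets: array of K scalar targets t_i, one per class."""
    return int(np.argmin(np.abs(np.asarray(targets) - y)))   # index of the nearest t_i

# e.g. four classes with hypothetical targets 0.2, 0.4, 0.6, 0.8:
# decide_single_output(0.55, [0.2, 0.4, 0.6, 0.8])  ->  2
```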
26
(b) Output Node for Each Class
Example: four classes, one output node per class.
Possible Decision Rules:
1. Select class Cj if yj is the largest output.
2. Select class Cj if (y1, y2, y3, y4) is closest to the target vector for class Cj.
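A sketch of these two decision rules for the one-output-node-per-class structure; the function names are illustrative:

```python
import numpy as np

def decide_largest(y):
    """Rule 1: the biggest output wins."""
    return int(np.argmax(y))

def decide_nearest_target(y, target_vectors):
    """Rule 2: pick the class whose target vector is closest to the output vector."""
    dists = [np.linalg.norm(np.asarray(y) - np.asarray(t)) for t in target_vectors]
    return int(np.argmin(dists))
```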
27
(b) Output Node for Each Class
28
(c) Binary Coded log2(Nc) Output Nodes
Example: four classes with two output nodes (log2 4 = 2).
29
(c) Binary Coded log2(Nc) Output Nodes
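A sketch of decoding the binary-coded output structure, assuming each class is assigned a log2(Nc)-bit code and each output is thresholded at 0.5; the threshold and bit ordering are assumptions, not taken from the slides:

```python
def decide_binary_coded(y, threshold=0.5):
    """y: outputs of the log2(Nc) coded nodes, most significant bit first."""
    bits = [1 if v > threshold else 0 for v in y]
    index = 0
    for b in bits:
        index = (index << 1) | b      # assemble the class index from the bits
    return index                      # e.g. outputs (0.9, 0.1) -> bits (1, 0) -> class 2
```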
30
Words of Wisdom
It is better to break a big problem down into several sub-problems than to try to find a single large neural net that performs the whole classification process.
Example: Design a neural net to classify letters from different fonts into individual letter classes. Assume that there are 26 classes represented by the letters:
S = { a, b, c, d, e, f, g, h, i, j, k, l, m, n, o, p, q, r, s, t, u, v, w, x, y, z }
31
Solution: Design a neural net (Neural Net 1) to separate classes A1, A2, A3, and A4; then design four neural networks to break these classes into single letters.
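A tiny sketch of this two-stage structure, where a first net assigns a sample to one of the four groups and a group-specific net then picks the individual letter; the names group_net and letter_nets are purely illustrative:

```python
def classify_letter(x, group_net, letter_nets):
    group = group_net(x)             # Neural Net 1: which of A1..A4?
    return letter_nets[group](x)     # per-group net: which letter within that group?
```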
32
on Training Set
33
Motivation for Momentum Correction!
34
Momentum Correction for Backpropagation: Weight Update Equation
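The update equation itself is shown as an image, so here is a hedged sketch of the standard momentum-corrected form, delta_w(n) = eta * delta * x + alpha * delta_w(n-1), where alpha is the momentum constant; the variable names are illustrative:

```python
import numpy as np

def update_with_momentum(W, delta, y_prev, prev_dW, eta, alpha):
    """Weight update that carries over a fraction alpha of the previous change."""
    dW = eta * np.outer(delta, y_prev) + alpha * prev_dW   # gradient step plus momentum term
    W += dW
    return W, dW     # keep dW to use as prev_dW on the next iteration
```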
35
Summary Lecture 22
1. Reviewed Backpropagation Algorithm
2. Presented Weight Update Rules 1 and 2 for Logistic and Tanh Activation Functions
3. Gave Output Structure for Neural Net Classifiers: Single, Multiple, and Coded Output Nodes
4. Spoke some Words of Wisdom
5. Presented an Overall Design and Testing Methodology
36
End of Lecture 22