Pattern Recognition: Statistical and Neural


Pattern Recognition: Statistical and Neural
Lonnie C. Ludeman
Lecture 22, Oct 28, 2005
Nanjing University of Science & Technology

Lecture 22 Topics
1. Review of the Backpropagation Algorithm
2. Weight Update Rules #1 and #2 for Logistic and Tanh Activation Functions
3. Output Structures for Neural Net Classifiers: Single, Multiple, and Coded Output Nodes
4. Words of Wisdom
5. Overall Design and Testing Methodology

Backpropagation Algorithm for Training a Feedforward Neural Network

Input pattern sample x_k

Calculate Outputs First Layer

Calculate Outputs Second Layer

Calculate Outputs Last Layer
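The forward pass on the preceding three slides can be sketched as follows. This is a minimal Python illustration, not the lecture's code: the layer sizes, the random weights W1, W2, W3, and the logistic activation are all assumptions for the example.

```python
import numpy as np

def logistic(v):
    """Logistic (sigmoid) activation used in this lecture."""
    return 1.0 / (1.0 + np.exp(-v))

# Assumed toy dimensions: 3 inputs, hidden layers of 4 and 3 units, 1 output.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # first-layer weights
W2 = rng.normal(size=(3, 4))   # second-layer weights
W3 = rng.normal(size=(1, 3))   # last-layer weights

x_k = np.array([0.2, -0.5, 0.8])   # input pattern sample x_k

y1 = logistic(W1 @ x_k)   # outputs of the first layer
y2 = logistic(W2 @ y1)    # outputs of the second layer
y3 = logistic(W3 @ y2)    # outputs of the last layer (network output)
```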

Check Performance

Single Sample Error:
$E_p = \tfrac{1}{2}\big(d[x(p)] - f(w^T(p)\,x(p))\big)^2$

Error Over All Samples:
$E_{TOTAL}(p) = \tfrac{1}{2}\sum_{i=0}^{N_s-1}\big(d[x(p-i)] - f(w^T(p-i)\,x(p-i))\big)^2$

Can be computed recursively:
$E_{TOTAL}(p+1) = E_{TOTAL}(p) + E_{p+1}(p+1) - E_{p-N_s}(p-N_s)$
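As a hedged sketch of the recursive bookkeeping above: the running total error over the last N_s samples can be updated by adding the newest sample's error and dropping the oldest one. The sample stream and window length below are made up for illustration.

```python
import numpy as np

def sample_error(d, y):
    """Single-sample squared error E_p = 1/2 (d - y)^2."""
    return 0.5 * (d - y) ** 2

# Assumed example: per-sample errors for a short stream, window of Ns samples.
Ns = 4
errors = [sample_error(d, y) for d, y in [(1, 0.7), (0, 0.2), (1, 0.9), (0, 0.4), (1, 0.6)]]

# Direct computation of the total error over the Ns samples ending at index p.
p = 3
E_total_p = sum(errors[p - Ns + 1 : p + 1])

# Recursive update: add the newest sample's error, drop the oldest one.
E_total_next = E_total_p + errors[p + 1] - errors[p + 1 - Ns]

assert np.isclose(E_total_next, sum(errors[p - Ns + 2 : p + 2]))
```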

Change Weights Last Layer using Rule #1

Change Weights Previous Layer using Rule #2

Change Weights Previous Layer using Modified Rule #2
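As a rough illustration of Rules #1 and #2 (the exact equations appear on the slides that follow), here is a hedged sketch of one backpropagation step for a small two-layer network with logistic activations: Rule #1 corrects the last-layer weights from the output error, and Rule #2 propagates that error back to correct the previous layer's weights. The layer sizes, learning rate, and sample are assumptions for the example.

```python
import numpy as np

def logistic(v):
    return 1.0 / (1.0 + np.exp(-v))

rng = np.random.default_rng(1)
eta = 0.5                                        # learning rate (assumed)
W1 = rng.normal(size=(3, 2))                     # previous (hidden) layer weights
W2 = rng.normal(size=(1, 3))                     # last (output) layer weights
x, d = np.array([0.5, -1.0]), np.array([1.0])    # pattern sample and desired output

# Forward pass
y1 = logistic(W1 @ x)
y2 = logistic(W2 @ y1)

# Rule #1: last-layer delta (logistic derivative is y(1 - y))
delta2 = (d - y2) * y2 * (1.0 - y2)

# Rule #2: propagate the delta back through the last-layer weights
delta1 = (W2.T @ delta2) * y1 * (1.0 - y1)

# Weight corrections for both layers
W2 += eta * np.outer(delta2, y1)
W1 += eta * np.outer(delta1, x)
```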

Continue Iterations: input the next pattern sample x_{k+1} and repeat the forward-pass and weight-update steps.

Repeat the process until the performance criterion is satisfied or the maximum number of iterations is reached. If the performance criterion is not satisfied when the maximum number of iterations is reached, the algorithm stops and NO design is obtained. If the performance criterion is satisfied, then the current weights and structure provide the required design.
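The stopping logic just described can be written as an outer training loop. Everything here is a hypothetical sketch of the control flow, not the lecture's algorithm listing: the update_weights and total_error methods, the error threshold, and the iteration cap are all assumed names.

```python
MAX_ITER = 10_000        # assumed maximum number of iterations
E_THRESH = 1e-3          # assumed acceptable total error on the training set

def backprop_training(net, samples):
    """Iterate until performance is satisfied or the iteration limit is hit."""
    for _ in range(MAX_ITER):
        for x, d in samples:
            net.update_weights(x, d)        # one backpropagation step (Rules #1 and #2)
        if net.total_error(samples) <= E_THRESH:
            return net                      # performance satisfied: freeze weights
    return None                             # limit reached without meeting performance: no design
```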

Freeze Weights to get Acceptable Neural Net Design

General Rule #1 for Weight Update

General Rule #2 for Weight Update: Layer L-1 (the weight correction follows)

Weight correction (General Rule #2)

Specific Rules for Given Activation Functions
1. Rule #1 for the Logistic Activation Function
2. Rule #2 for the Logistic Activation Function
3. Rule #1 for the Tanh Activation Function
4. Rule #2 for the Tanh Activation Function

Rule #1 for the Logistic Activation Function: Lth Layer Weight Update Equation

Rule #2 for the Logistic Activation Function: (L-1)th Layer Weight Correction Equation

Rule #1 for the Tanh Activation Function: Lth Layer Weight Update Equation

Rule #2 for the Tanh Activation Function: (L-1)th Layer Weight Correction Equation
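The only place the logistic and tanh versions of Rules #1 and #2 differ is the activation derivative that multiplies each delta. A minimal sketch of those derivatives, written in terms of the already-computed activation output y = f(v):

```python
def logistic_deriv_from_output(y):
    """f(v) = 1 / (1 + exp(-v))  =>  f'(v) = f(v) (1 - f(v)) = y (1 - y)."""
    return y * (1.0 - y)

def tanh_deriv_from_output(y):
    """f(v) = tanh(v)  =>  f'(v) = 1 - tanh(v)^2 = 1 - y^2."""
    return 1.0 - y ** 2
```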

Selection of Output Structure for Classifier Design
(a) Single output node
(b) N output nodes for N classes
(c) log2(N) binary-coded output nodes

(a) Single Output Node: example of four classes with one output node

(a) Single Output Node, K-class case with one output neuron: target t_i selected as the center of region R_i

(b) Output Node for Each Class: example of four classes with four output nodes.
Possible Decision Rules:
1. Select class Cj if yj is the largest output.
2. Select class Cj if (y1, y2, y3, y4) is closest to the target vector for class Cj.

(b) Output Node for Each Class

(c) Binary-Coded log2(NC) Output Nodes: example of four classes with two output nodes

(c) Binary-Coded log2(NC) Output Nodes
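A hedged sketch of the three output structures for a four-class problem: (a) one output node compared against per-class targets, (b) one node per class decided by the largest output, and (c) log2(4) = 2 binary-coded nodes read off with a threshold. The target values and the 0.5 threshold are assumptions for the illustration.

```python
import numpy as np

# (a) Single output node: assumed targets t_i at the centers of four regions R_i.
targets_single = {0: 0.125, 1: 0.375, 2: 0.625, 3: 0.875}

def decide_single_node(y):
    """Pick the class whose target t_i is closest to the single output y."""
    return min(targets_single, key=lambda c: abs(y - targets_single[c]))

def decide_one_node_per_class(y_vec):
    """Select class C_j whose output y_j is the largest."""
    return int(np.argmax(y_vec))

def decide_binary_coded(y_vec):
    """Threshold each of the log2(N) outputs at 0.5 and read the binary code."""
    bits = (np.asarray(y_vec) > 0.5).astype(int)
    return int(bits[0]) * 2 + int(bits[1])

print(decide_single_node(0.7))                           # -> class 2
print(decide_one_node_per_class([0.1, 0.8, 0.3, 0.2]))   # -> class 1
print(decide_binary_coded([0.9, 0.2]))                   # -> class 2 (code "10")
```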

Words of Wisdom
It is better to break a big problem down into several subproblems than to try to find a single large neural net that performs the whole classification process.
Example: Design a neural net to classify letters from different fonts into individual letter classes. Assume that there are 26 classes represented by the letters:
S = { a, b, c, d, e, f, g, h, i, j, k, l, m, n, o, p, q, r, s, t, u, v, w, x, y, z }

Solution: Design a neural net (Neural Net 1) to separate the group classes A1, A2, A3, and A4; then design four neural networks, one per group, to break these classes down into single letters.
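One way to read this decomposition is as a two-stage classifier: Neural Net 1 routes a sample to one of the four groups, and a group-specific net then picks the individual letter. The split of the 26 letters into A1..A4 and the helper names below are hypothetical; the lecture does not specify them.

```python
# Hypothetical split of the 26 letters into four groups A1..A4 (not from the lecture).
GROUPS = {
    "A1": list("abcdefg"),
    "A2": list("hijklm"),
    "A3": list("nopqrst"),
    "A4": list("uvwxyz"),
}

def classify_letter(x, group_net, letter_nets):
    """Stage 1: group_net (Neural Net 1) picks the group; stage 2: that group's net picks the letter."""
    group = group_net(x)           # e.g. returns "A2"
    return letter_nets[group](x)   # e.g. returns "k"
```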

on Training Set

Motivation for Momentum Correction!

Momentum Correction for Backpropagation: Weight Update Equation
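A common form of the momentum correction adds a fraction of the previous weight change to the current gradient step. A minimal sketch, with the momentum coefficient alpha and the array shapes assumed for illustration (the exact slide equation is in the figure):

```python
import numpy as np

eta, alpha = 0.5, 0.9              # learning rate and momentum coefficient (assumed)

def momentum_step(W, grad, prev_dW):
    """Delta w(t) = -eta * dE/dw + alpha * Delta w(t-1)."""
    dW = -eta * grad + alpha * prev_dW
    return W + dW, dW

# Example: one update with a dummy gradient.
W = np.zeros((3, 2))
prev_dW = np.zeros_like(W)
grad = 0.1 * np.ones_like(W)
W, prev_dW = momentum_step(W, grad, prev_dW)
```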

Summary of Lecture 22
1. Reviewed the Backpropagation Algorithm
2. Presented Weight Update Rules #1 and #2 for Logistic and Tanh Activation Functions
3. Gave Output Structures for Neural Net Classifiers: Single, Multiple, and Coded Output Nodes
4. Spoke some Words of Wisdom
5. Presented an Overall Design and Testing Methodology

End of Lecture 22